Nov 26 15:07:15 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 26 15:07:15 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 26 15:07:15 crc restorecon[4690]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 
15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 
crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 
15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:15 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc 
restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 26 15:07:16 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 26 15:07:16 crc kubenswrapper[4785]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 15:07:16 crc kubenswrapper[4785]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 26 15:07:16 crc kubenswrapper[4785]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 15:07:16 crc kubenswrapper[4785]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 26 15:07:16 crc kubenswrapper[4785]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 26 15:07:16 crc kubenswrapper[4785]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.775479 4785 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780521 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780580 4785 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780591 4785 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780601 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780610 4785 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780619 4785 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780628 4785 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780636 4785 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780645 4785 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780654 4785 
feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780662 4785 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780669 4785 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780677 4785 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780685 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780693 4785 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780700 4785 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780708 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780716 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780723 4785 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780731 4785 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780738 4785 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780748 4785 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780756 4785 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780764 4785 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 15:07:16 crc 
kubenswrapper[4785]: W1126 15:07:16.780771 4785 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780779 4785 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780789 4785 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780800 4785 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780811 4785 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780821 4785 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780829 4785 feature_gate.go:330] unrecognized feature gate: Example Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780838 4785 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780849 4785 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780859 4785 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780869 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780878 4785 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780886 4785 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780895 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780902 4785 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780910 4785 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780919 4785 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780927 4785 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780935 4785 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780942 4785 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780954 4785 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780965 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780973 4785 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780981 4785 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780989 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.780997 4785 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781006 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781014 4785 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781022 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781029 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781047 4785 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781054 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781063 4785 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781073 4785 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781082 4785 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781090 4785 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781097 4785 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781105 4785 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781113 4785 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781120 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781127 4785 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781135 4785 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781143 4785 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781169 4785 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781176 4785 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781184 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.781191 4785 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782253 4785 flags.go:64] FLAG: --address="0.0.0.0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782285 4785 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782301 4785 flags.go:64] FLAG: --anonymous-auth="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 
15:07:16.782312 4785 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782324 4785 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782334 4785 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782346 4785 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782357 4785 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782367 4785 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782376 4785 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782386 4785 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782395 4785 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782404 4785 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782413 4785 flags.go:64] FLAG: --cgroup-root="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782422 4785 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782431 4785 flags.go:64] FLAG: --client-ca-file="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782440 4785 flags.go:64] FLAG: --cloud-config="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782449 4785 flags.go:64] FLAG: --cloud-provider="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782458 4785 flags.go:64] FLAG: --cluster-dns="[]" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782472 4785 flags.go:64] FLAG: --cluster-domain="" 
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782481 4785 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782491 4785 flags.go:64] FLAG: --config-dir="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782500 4785 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782509 4785 flags.go:64] FLAG: --container-log-max-files="5" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782520 4785 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782529 4785 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782538 4785 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782548 4785 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782587 4785 flags.go:64] FLAG: --contention-profiling="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782596 4785 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782605 4785 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782614 4785 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782623 4785 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782634 4785 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782643 4785 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782652 4785 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 26 15:07:16 crc kubenswrapper[4785]: 
I1126 15:07:16.782661 4785 flags.go:64] FLAG: --enable-load-reader="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782670 4785 flags.go:64] FLAG: --enable-server="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782679 4785 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782689 4785 flags.go:64] FLAG: --event-burst="100" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782699 4785 flags.go:64] FLAG: --event-qps="50" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782707 4785 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782716 4785 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782725 4785 flags.go:64] FLAG: --eviction-hard="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782736 4785 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782745 4785 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782753 4785 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782763 4785 flags.go:64] FLAG: --eviction-soft="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782771 4785 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782781 4785 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782790 4785 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782799 4785 flags.go:64] FLAG: --experimental-mounter-path="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782808 4785 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 26 15:07:16 crc 
kubenswrapper[4785]: I1126 15:07:16.782816 4785 flags.go:64] FLAG: --fail-swap-on="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782825 4785 flags.go:64] FLAG: --feature-gates="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782844 4785 flags.go:64] FLAG: --file-check-frequency="20s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782853 4785 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782862 4785 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782872 4785 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782880 4785 flags.go:64] FLAG: --healthz-port="10248" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782890 4785 flags.go:64] FLAG: --help="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782898 4785 flags.go:64] FLAG: --hostname-override="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782907 4785 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782916 4785 flags.go:64] FLAG: --http-check-frequency="20s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782925 4785 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782934 4785 flags.go:64] FLAG: --image-credential-provider-config="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782942 4785 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782951 4785 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782960 4785 flags.go:64] FLAG: --image-service-endpoint="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782969 4785 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 26 15:07:16 crc 
kubenswrapper[4785]: I1126 15:07:16.782977 4785 flags.go:64] FLAG: --kube-api-burst="100" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782986 4785 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.782996 4785 flags.go:64] FLAG: --kube-api-qps="50" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783005 4785 flags.go:64] FLAG: --kube-reserved="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783013 4785 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783023 4785 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783032 4785 flags.go:64] FLAG: --kubelet-cgroups="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783040 4785 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783049 4785 flags.go:64] FLAG: --lock-file="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783060 4785 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783069 4785 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783078 4785 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783092 4785 flags.go:64] FLAG: --log-json-split-stream="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783100 4785 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783110 4785 flags.go:64] FLAG: --log-text-split-stream="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783118 4785 flags.go:64] FLAG: --logging-format="text" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783127 4785 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" 
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783136 4785 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783145 4785 flags.go:64] FLAG: --manifest-url="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783154 4785 flags.go:64] FLAG: --manifest-url-header="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783166 4785 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783176 4785 flags.go:64] FLAG: --max-open-files="1000000" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783186 4785 flags.go:64] FLAG: --max-pods="110" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783195 4785 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783204 4785 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783213 4785 flags.go:64] FLAG: --memory-manager-policy="None" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783222 4785 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783231 4785 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783239 4785 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783249 4785 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783269 4785 flags.go:64] FLAG: --node-status-max-images="50" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783278 4785 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783287 4785 flags.go:64] FLAG: --oom-score-adj="-999" Nov 26 15:07:16 crc 
kubenswrapper[4785]: I1126 15:07:16.783296 4785 flags.go:64] FLAG: --pod-cidr="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783304 4785 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783319 4785 flags.go:64] FLAG: --pod-manifest-path="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783327 4785 flags.go:64] FLAG: --pod-max-pids="-1" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783336 4785 flags.go:64] FLAG: --pods-per-core="0" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783345 4785 flags.go:64] FLAG: --port="10250" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783355 4785 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783363 4785 flags.go:64] FLAG: --provider-id="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783372 4785 flags.go:64] FLAG: --qos-reserved="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783381 4785 flags.go:64] FLAG: --read-only-port="10255" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783390 4785 flags.go:64] FLAG: --register-node="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783401 4785 flags.go:64] FLAG: --register-schedulable="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783411 4785 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783425 4785 flags.go:64] FLAG: --registry-burst="10" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783434 4785 flags.go:64] FLAG: --registry-qps="5" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783443 4785 flags.go:64] FLAG: --reserved-cpus="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783451 4785 flags.go:64] FLAG: --reserved-memory="" Nov 
26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783462 4785 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783471 4785 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783480 4785 flags.go:64] FLAG: --rotate-certificates="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783489 4785 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783497 4785 flags.go:64] FLAG: --runonce="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783506 4785 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783515 4785 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783525 4785 flags.go:64] FLAG: --seccomp-default="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783535 4785 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783544 4785 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783578 4785 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783588 4785 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783597 4785 flags.go:64] FLAG: --storage-driver-password="root" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783606 4785 flags.go:64] FLAG: --storage-driver-secure="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783615 4785 flags.go:64] FLAG: --storage-driver-table="stats" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783625 4785 flags.go:64] FLAG: --storage-driver-user="root" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783633 4785 flags.go:64] FLAG: 
--streaming-connection-idle-timeout="4h0m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783642 4785 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783651 4785 flags.go:64] FLAG: --system-cgroups="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783660 4785 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783674 4785 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783683 4785 flags.go:64] FLAG: --tls-cert-file="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783691 4785 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783702 4785 flags.go:64] FLAG: --tls-min-version="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783711 4785 flags.go:64] FLAG: --tls-private-key-file="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783720 4785 flags.go:64] FLAG: --topology-manager-policy="none" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783729 4785 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783738 4785 flags.go:64] FLAG: --topology-manager-scope="container" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783747 4785 flags.go:64] FLAG: --v="2" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783759 4785 flags.go:64] FLAG: --version="false" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783770 4785 flags.go:64] FLAG: --vmodule="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783780 4785 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.783789 4785 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784015 4785 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallGCP Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784027 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784037 4785 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784046 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784054 4785 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784062 4785 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784070 4785 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784078 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784086 4785 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784094 4785 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784103 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784111 4785 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784119 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784127 4785 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784135 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 
15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784142 4785 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784150 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784157 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784167 4785 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784177 4785 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784184 4785 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784194 4785 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784204 4785 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784213 4785 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784221 4785 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784229 4785 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784237 4785 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784245 4785 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784252 4785 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784260 4785 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784268 4785 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784275 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784285 4785 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784294 4785 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784303 4785 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784313 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784321 4785 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784330 4785 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784338 4785 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.784346 4785 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786756 4785 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786781 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786791 4785 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786799 4785 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786806 4785 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786814 4785 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786825 4785 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786833 4785 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786840 4785 feature_gate.go:330] unrecognized feature gate: Example Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786848 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786856 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786864 4785 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786872 4785 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786880 4785 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786887 4785 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786897 4785 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786905 4785 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786915 4785 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786925 4785 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786935 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786944 4785 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786952 4785 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786960 4785 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786968 4785 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786982 4785 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786990 4785 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.786997 4785 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.787005 4785 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.787013 4785 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.787020 4785 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.787028 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.787040 4785 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false 
KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.803840 4785 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.803896 4785 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804052 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804066 4785 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804104 4785 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804119 4785 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804130 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804140 4785 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804148 4785 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804155 4785 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804163 4785 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804171 4785 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 15:07:16 
crc kubenswrapper[4785]: W1126 15:07:16.804178 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804186 4785 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804193 4785 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804201 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804210 4785 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804217 4785 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804225 4785 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804233 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804241 4785 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804250 4785 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804258 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804265 4785 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804273 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804281 4785 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804290 4785 feature_gate.go:330] 
unrecognized feature gate: VSphereDriverConfiguration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804298 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804306 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804319 4785 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804330 4785 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804367 4785 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804375 4785 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804386 4785 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804397 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804406 4785 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804414 4785 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804422 4785 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804430 4785 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804438 4785 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804446 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804455 4785 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804464 4785 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804473 4785 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804481 4785 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804492 4785 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804500 4785 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804510 4785 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804517 4785 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804525 4785 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804598 4785 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804606 4785 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804615 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804624 4785 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804631 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804640 4785 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804648 4785 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804655 4785 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804664 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804674 4785 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804683 4785 feature_gate.go:330] unrecognized feature gate: Example Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804692 4785 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804702 4785 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804712 4785 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804720 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804728 4785 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804736 4785 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804744 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804752 4785 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804760 4785 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804767 4785 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804777 4785 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.804787 4785 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.804803 4785 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true 
MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805076 4785 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805091 4785 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805101 4785 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805109 4785 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805119 4785 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805127 4785 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805136 4785 feature_gate.go:330] unrecognized feature gate: Example Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805144 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805152 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805160 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805168 4785 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805176 4785 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805183 4785 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805191 4785 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805199 4785 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805206 4785 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805214 4785 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805222 4785 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805229 4785 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805238 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805245 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805256 4785 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805267 4785 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805276 4785 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805285 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805295 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805304 4785 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805312 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805319 4785 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805327 4785 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805336 4785 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805343 4785 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805351 4785 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805358 4785 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805368 4785 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805379 4785 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805387 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805395 4785 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805403 4785 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805411 4785 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805420 4785 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805429 4785 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805438 4785 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805447 4785 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805455 4785 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805463 4785 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805471 4785 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805481 4785 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805490 4785 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805499 4785 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805507 4785 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805516 4785 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805525 4785 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805533 4785 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805541 4785 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805549 4785 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805588 4785 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805597 4785 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805605 4785 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805613 4785 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805621 4785 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805629 4785 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 26 15:07:16 crc kubenswrapper[4785]: 
W1126 15:07:16.805637 4785 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805645 4785 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805653 4785 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805661 4785 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805668 4785 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805676 4785 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805684 4785 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805691 4785 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.805698 4785 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.805711 4785 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.806864 4785 server.go:940] "Client rotation is on, will bootstrap in background" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.814798 4785 bootstrap.go:85] "Current kubeconfig file contents are 
still valid, no bootstrap necessary" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.814940 4785 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.816641 4785 server.go:997] "Starting client certificate rotation" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.816691 4785 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.816949 4785 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-01 07:17:56.45512627 +0000 UTC Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.817091 4785 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 112h10m39.63804057s for next certificate rotation Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.845938 4785 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.848811 4785 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.873189 4785 log.go:25] "Validated CRI v1 runtime API" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.913919 4785 log.go:25] "Validated CRI v1 image API" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.916075 4785 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.921781 4785 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-26-15-03-05-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 26 15:07:16 crc 
kubenswrapper[4785]: I1126 15:07:16.921828 4785 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.939285 4785 manager.go:217] Machine: {Timestamp:2025-11-26 15:07:16.936670626 +0000 UTC m=+0.615036410 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0559caf5-1d73-4afa-a2e0-e8d6b738bfd5 BootID:564b0a72-079b-4004-bac6-7e7947bc6860 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 
Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:18:3b:b0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:18:3b:b0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c2:97:aa Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7a:19:99 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:20:8b:f2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a2:4d:43 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:da:35:49:f5:47:36 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:36:d7:30:27:e6:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.939626 4785 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.939794 4785 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.940802 4785 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.941036 4785 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.941079 4785 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.945781 4785 topology_manager.go:138] "Creating topology manager with none policy"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.945828 4785 container_manager_linux.go:303] "Creating device plugin manager"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.946328 4785 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.946372 4785 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.946692 4785 state_mem.go:36] "Initialized new in-memory state store"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.946883 4785 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.951518 4785 kubelet.go:418] "Attempting to sync node with API server"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.951582 4785 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.951610 4785 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.951632 4785 kubelet.go:324] "Adding apiserver pod source"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.951679 4785 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.956187 4785 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.957228 4785 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.958846 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Nov 26 15:07:16 crc kubenswrapper[4785]: E1126 15:07:16.958957 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.959053 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Nov 26 15:07:16 crc kubenswrapper[4785]: E1126 15:07:16.959112 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.959907 4785 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961516 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961586 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961601 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961615 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961637 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961654 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961667 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961689 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961704 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961718 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961761 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.961776 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.962799 4785 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.963537 4785 server.go:1280] "Started kubelet"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.963720 4785 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.963963 4785 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.964581 4785 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.964671 4785 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Nov 26 15:07:16 crc systemd[1]: Started Kubernetes Kubelet.
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.966210 4785 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.966259 4785 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.966461 4785 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 15:14:04.042819295 +0000 UTC
Nov 26 15:07:16 crc kubenswrapper[4785]: E1126 15:07:16.966634 4785 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.966687 4785 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.966716 4785 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.966876 4785 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 26 15:07:16 crc kubenswrapper[4785]: W1126 15:07:16.967717 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused
Nov 26 15:07:16 crc kubenswrapper[4785]: E1126 15:07:16.967840 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError"
Nov 26 15:07:16 crc kubenswrapper[4785]: E1126 15:07:16.968577 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.969615 4785 server.go:460] "Adding debug handlers to kubelet server"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.977240 4785 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.977281 4785 factory.go:55] Registering systemd factory
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.977298 4785 factory.go:221] Registration of the systemd container factory successfully
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.977897 4785 factory.go:153] Registering CRI-O factory
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.978086 4785 factory.go:221] Registration of the crio container factory successfully
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.978150 4785 factory.go:103] Registering Raw factory
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.978174 4785 manager.go:1196] Started watching for new ooms in manager
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.979143 4785 manager.go:319] Starting recovery of all containers
Nov 26 15:07:16 crc kubenswrapper[4785]: E1126 15:07:16.978030 4785 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b96f016bd10ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:07:16.963487982 +0000 UTC m=+0.641853786,LastTimestamp:2025-11-26 15:07:16.963487982 +0000 UTC m=+0.641853786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985692 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985733 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985744 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985754 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985763 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985774 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985782 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985792 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985807 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985818 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985826 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985836 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985846 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985857 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985868 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985879 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985889 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985898 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985908 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985915 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985926 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985935 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985944 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985952 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985961 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.985994 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986006 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986035 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986044 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986053 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986061 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986069 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986078 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986086 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986095 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986103 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986113 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986122 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986133 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986146 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986156 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986167 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986177 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986188 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986223 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986232 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986243 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986252 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986261 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986272 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986283 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986293 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986306 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986318 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986330 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986341 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986353 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986362 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986372 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986381 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986391 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986401 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986410 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986419 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986429 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986438 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986447 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986457 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986466 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986476 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986486 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986498 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986509 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986519 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986530 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986539 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986607 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986619 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986629 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986637 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986646 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986655 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986664 4785 reconstruct.go:130] "Volume is marked as uncertain
and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986678 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986688 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986696 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986706 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986716 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986726 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986735 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986744 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986754 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986763 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986772 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986782 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986791 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986800 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986810 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986819 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986828 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986837 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986846 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986855 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986864 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986878 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986888 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986899 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986910 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986920 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986932 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986942 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986953 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986963 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986973 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986982 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.986991 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987000 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987009 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987023 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 
15:07:16.987033 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987041 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987082 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987091 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987100 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987109 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987120 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987128 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987138 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987148 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987158 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987167 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987177 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987187 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987195 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987205 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987216 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987225 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987235 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987245 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987256 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987266 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987276 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987287 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987299 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987310 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987322 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987331 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987342 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987354 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987365 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987378 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987470 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987480 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987490 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987501 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.987524 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993255 4785 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993285 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993298 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993317 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993327 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993340 4785 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993350 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993363 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993375 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993426 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993439 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993453 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993464 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993475 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993487 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993498 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993510 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993520 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993533 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993544 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993569 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993579 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993590 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993601 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993613 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993623 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993634 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993644 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993654 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993665 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" 
seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993677 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993688 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993698 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993709 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993720 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993731 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993743 4785 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993753 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993765 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993776 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993788 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993799 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993811 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993822 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993834 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993846 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993856 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993879 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993888 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993900 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993911 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993922 4785 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993931 4785 reconstruct.go:97] "Volume reconstruction finished" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.993938 4785 reconciler.go:26] "Reconciler: start to sync state" Nov 26 15:07:16 crc kubenswrapper[4785]: I1126 15:07:16.996675 4785 manager.go:324] Recovery completed Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.005163 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.007040 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.007096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.007113 4785 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.010508 4785 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.010545 4785 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.010595 4785 state_mem.go:36] "Initialized new in-memory state store" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.032071 4785 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.034354 4785 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.035021 4785 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.035080 4785 kubelet.go:2335] "Starting kubelet main sync loop" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.035187 4785 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.035573 4785 policy_none.go:49] "None policy: Start" Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.035635 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.035709 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.036644 4785 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.036677 4785 state_mem.go:35] "Initializing new in-memory state store" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.067131 4785 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.093490 4785 manager.go:334] "Starting Device Plugin manager" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.093762 4785 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.093785 4785 server.go:79] "Starting device plugin registration server" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.094152 4785 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.094169 4785 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.094385 4785 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.094482 4785 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.094490 4785 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.105010 4785 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 15:07:17 crc 
kubenswrapper[4785]: I1126 15:07:17.136287 4785 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.136388 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.139971 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.140045 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.140078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.140432 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.141407 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.141472 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.144747 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.144792 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.144805 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.144940 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.145206 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.145256 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.145268 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.145337 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.145388 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146101 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146153 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146257 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146512 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146583 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146937 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146969 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.146984 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.148312 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.148349 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.148365 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.148576 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.148882 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.148937 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.150607 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.150631 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.150642 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151093 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151115 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151126 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151134 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151184 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151207 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151442 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.151481 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.152354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.152377 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.152388 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.169499 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.195956 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196115 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196202 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196333 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196427 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196501 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196548 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196607 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196639 4785 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196669 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196698 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196735 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196825 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196892 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.196988 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.197034 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.197913 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.197979 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.198002 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.198050 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.198608 4785 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Nov 26 
15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298633 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298698 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298721 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298743 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298774 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298815 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298839 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298870 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298900 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298927 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298930 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298947 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299008 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299048 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298971 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299029 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298904 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299048 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299002 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299100 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299090 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.298952 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 
15:07:17.299301 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299368 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299412 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299397 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299492 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299436 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299483 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.299601 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.398956 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.400836 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.400905 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.400931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.400972 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.401683 4785 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Nov 26 15:07:17 crc 
kubenswrapper[4785]: I1126 15:07:17.486957 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.497144 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.520307 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.531093 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.537509 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.543449 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-765e7663046d39c1622b9ca981a93146607fceebd5de9f709b130402b7d09683 WatchSource:0}: Error finding container 765e7663046d39c1622b9ca981a93146607fceebd5de9f709b130402b7d09683: Status 404 returned error can't find the container with id 765e7663046d39c1622b9ca981a93146607fceebd5de9f709b130402b7d09683 Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.544393 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-833140d6013c7211851cd7e74c156977514a72446402bf50f4a452c62ca88d8f WatchSource:0}: Error finding container 833140d6013c7211851cd7e74c156977514a72446402bf50f4a452c62ca88d8f: Status 404 returned error can't find the container with id 
833140d6013c7211851cd7e74c156977514a72446402bf50f4a452c62ca88d8f Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.557361 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8d43c6489d5c3495e5ed292f4277b148ec3a704d9ecf4199425c1ea9f177a44e WatchSource:0}: Error finding container 8d43c6489d5c3495e5ed292f4277b148ec3a704d9ecf4199425c1ea9f177a44e: Status 404 returned error can't find the container with id 8d43c6489d5c3495e5ed292f4277b148ec3a704d9ecf4199425c1ea9f177a44e Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.561280 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-fd66b636b03e76be26e51e06b4b829746f8c47024148ce42ed03f1fa28a62369 WatchSource:0}: Error finding container fd66b636b03e76be26e51e06b4b829746f8c47024148ce42ed03f1fa28a62369: Status 404 returned error can't find the container with id fd66b636b03e76be26e51e06b4b829746f8c47024148ce42ed03f1fa28a62369 Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.562017 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a36ec5056d7e6deba946fa7d17376914f87cf0f1e8cdfe97205f9d55c960497b WatchSource:0}: Error finding container a36ec5056d7e6deba946fa7d17376914f87cf0f1e8cdfe97205f9d55c960497b: Status 404 returned error can't find the container with id a36ec5056d7e6deba946fa7d17376914f87cf0f1e8cdfe97205f9d55c960497b Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.570123 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" 
Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.793669 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.793757 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.802041 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.803366 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.803397 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.803405 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.803424 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.803948 4785 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.854328 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.854476 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.862674 4785 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b96f016bd10ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:07:16.963487982 +0000 UTC m=+0.641853786,LastTimestamp:2025-11-26 15:07:16.963487982 +0000 UTC m=+0.641853786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:07:17 crc kubenswrapper[4785]: W1126 15:07:17.878648 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:17 crc kubenswrapper[4785]: E1126 15:07:17.878770 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.965926 4785 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.966814 4785 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 14:54:35.857799091 +0000 UTC Nov 26 15:07:17 crc kubenswrapper[4785]: I1126 15:07:17.966885 4785 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 263h47m17.89091653s for next certificate rotation Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.040189 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a36ec5056d7e6deba946fa7d17376914f87cf0f1e8cdfe97205f9d55c960497b"} Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.041722 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fd66b636b03e76be26e51e06b4b829746f8c47024148ce42ed03f1fa28a62369"} Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.043695 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8d43c6489d5c3495e5ed292f4277b148ec3a704d9ecf4199425c1ea9f177a44e"} Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.045171 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"765e7663046d39c1622b9ca981a93146607fceebd5de9f709b130402b7d09683"} Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.046303 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"833140d6013c7211851cd7e74c156977514a72446402bf50f4a452c62ca88d8f"} Nov 26 15:07:18 crc kubenswrapper[4785]: E1126 15:07:18.371502 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Nov 26 15:07:18 crc kubenswrapper[4785]: W1126 15:07:18.429407 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:18 crc kubenswrapper[4785]: E1126 15:07:18.429580 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.604365 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.605639 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:18 crc 
kubenswrapper[4785]: I1126 15:07:18.605676 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.605689 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.605714 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:18 crc kubenswrapper[4785]: E1126 15:07:18.606103 4785 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Nov 26 15:07:18 crc kubenswrapper[4785]: I1126 15:07:18.965932 4785 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.053530 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.053616 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.053637 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.056203 4785 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a" exitCode=0 Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.056327 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.056398 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.057594 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.057622 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.057631 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.060261 4785 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08" exitCode=0 Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.060337 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 
15:07:19.060454 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.061813 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.061858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.061878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.064243 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.064544 4785 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1" exitCode=0 Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.064680 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.064730 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.065800 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.065851 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.065873 4785 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.066023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.066052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.066069 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.070758 4785 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2" exitCode=0 Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.070838 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2"} Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.070923 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.072195 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.072251 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.072272 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:19 crc kubenswrapper[4785]: W1126 15:07:19.501679 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:19 crc kubenswrapper[4785]: E1126 15:07:19.501740 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Nov 26 15:07:19 crc kubenswrapper[4785]: I1126 15:07:19.966368 4785 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Nov 26 15:07:19 crc kubenswrapper[4785]: E1126 15:07:19.972325 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="3.2s" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.074545 4785 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b" exitCode=0 Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.074678 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.074699 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.075616 4785 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.075649 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.075658 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.077837 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.077875 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.077894 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.077910 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.087691 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.087747 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.088849 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.088884 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.088898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.092338 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.092368 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.092378 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.092382 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610"} Nov 26 15:07:20 crc 
kubenswrapper[4785]: I1126 15:07:20.093027 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.093057 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.093066 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.095998 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943"} Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.096083 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.096747 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.096777 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.096788 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.206967 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.208028 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.208064 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.208077 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.208100 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:20 crc kubenswrapper[4785]: E1126 15:07:20.208483 4785 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Nov 26 15:07:20 crc kubenswrapper[4785]: I1126 15:07:20.873541 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.102092 4785 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d" exitCode=0 Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.102203 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d"} Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.102261 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.103730 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.103802 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.103855 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.110102 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.110418 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede"} Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.110534 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.110597 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.110633 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.111898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.111950 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.111975 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113159 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113216 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113248 4785 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113113 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113314 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113339 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.113773 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.114123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.114159 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:21 crc kubenswrapper[4785]: I1126 15:07:21.973242 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117265 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa"} Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117324 4785 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117350 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238"} Nov 26 
15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117382 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76"} Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117391 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117410 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406"} Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.117365 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.119870 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.119935 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.119954 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.120588 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.120669 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.120693 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 
15:07:22.517263 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.517401 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.518431 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.518484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.518503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:22 crc kubenswrapper[4785]: I1126 15:07:22.849809 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.016626 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.125353 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.125475 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.125801 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280"} Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.126922 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.126962 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.126975 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.126956 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.127019 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.127036 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.409232 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.411196 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.411283 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.411305 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.411349 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:23 crc kubenswrapper[4785]: I1126 15:07:23.618425 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.127666 4785 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.127710 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.129123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.129123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.129189 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.129213 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.129212 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.129275 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.260019 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.260193 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.261392 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 15:07:24.261433 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:24 crc kubenswrapper[4785]: I1126 
15:07:24.261445 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:25 crc kubenswrapper[4785]: I1126 15:07:25.129693 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:25 crc kubenswrapper[4785]: I1126 15:07:25.130970 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:25 crc kubenswrapper[4785]: I1126 15:07:25.131058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:25 crc kubenswrapper[4785]: I1126 15:07:25.131087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:25 crc kubenswrapper[4785]: I1126 15:07:25.518357 4785 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 15:07:25 crc kubenswrapper[4785]: I1126 15:07:25.518454 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 15:07:26 crc kubenswrapper[4785]: I1126 15:07:26.288596 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:26 crc kubenswrapper[4785]: I1126 15:07:26.288850 4785 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Nov 26 15:07:26 crc kubenswrapper[4785]: I1126 15:07:26.290261 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:26 crc kubenswrapper[4785]: I1126 15:07:26.290332 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:26 crc kubenswrapper[4785]: I1126 15:07:26.290358 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:26 crc kubenswrapper[4785]: I1126 15:07:26.294347 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:27 crc kubenswrapper[4785]: E1126 15:07:27.105096 4785 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 26 15:07:27 crc kubenswrapper[4785]: I1126 15:07:27.134415 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:27 crc kubenswrapper[4785]: I1126 15:07:27.134738 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:27 crc kubenswrapper[4785]: I1126 15:07:27.135504 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:27 crc kubenswrapper[4785]: I1126 15:07:27.135594 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:27 crc kubenswrapper[4785]: I1126 15:07:27.135613 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:28 crc kubenswrapper[4785]: I1126 15:07:28.138004 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Nov 26 15:07:28 crc kubenswrapper[4785]: I1126 15:07:28.139449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:28 crc kubenswrapper[4785]: I1126 15:07:28.139506 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:28 crc kubenswrapper[4785]: I1126 15:07:28.139525 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:28 crc kubenswrapper[4785]: I1126 15:07:28.144916 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:29 crc kubenswrapper[4785]: I1126 15:07:29.139823 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:29 crc kubenswrapper[4785]: I1126 15:07:29.142150 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:29 crc kubenswrapper[4785]: I1126 15:07:29.142196 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:29 crc kubenswrapper[4785]: I1126 15:07:29.142215 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:30 crc kubenswrapper[4785]: W1126 15:07:30.633525 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.633722 4785 trace.go:236] Trace[1633176676]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 15:07:20.631) (total time: 10001ms): Nov 26 15:07:30 crc kubenswrapper[4785]: 
Trace[1633176676]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:07:30.633) Nov 26 15:07:30 crc kubenswrapper[4785]: Trace[1633176676]: [10.001890226s] [10.001890226s] END Nov 26 15:07:30 crc kubenswrapper[4785]: E1126 15:07:30.633761 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 26 15:07:30 crc kubenswrapper[4785]: W1126 15:07:30.668083 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.668180 4785 trace.go:236] Trace[375224893]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 15:07:20.666) (total time: 10001ms): Nov 26 15:07:30 crc kubenswrapper[4785]: Trace[375224893]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:07:30.668) Nov 26 15:07:30 crc kubenswrapper[4785]: Trace[375224893]: [10.001822286s] [10.001822286s] END Nov 26 15:07:30 crc kubenswrapper[4785]: E1126 15:07:30.668206 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.713644 4785 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.713869 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.715150 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.715346 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.715515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:30 crc kubenswrapper[4785]: I1126 15:07:30.966499 4785 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 26 15:07:31 crc kubenswrapper[4785]: W1126 15:07:31.597902 4785 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 26 15:07:31 crc kubenswrapper[4785]: I1126 15:07:31.598025 4785 trace.go:236] Trace[2025890431]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 15:07:21.596) (total time: 10001ms): Nov 26 15:07:31 crc kubenswrapper[4785]: Trace[2025890431]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:07:31.597) Nov 26 15:07:31 crc kubenswrapper[4785]: Trace[2025890431]: [10.001568929s] [10.001568929s] END Nov 26 15:07:31 crc kubenswrapper[4785]: 
E1126 15:07:31.598057 4785 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 26 15:07:31 crc kubenswrapper[4785]: I1126 15:07:31.830269 4785 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 26 15:07:31 crc kubenswrapper[4785]: I1126 15:07:31.830342 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 26 15:07:31 crc kubenswrapper[4785]: I1126 15:07:31.835634 4785 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 26 15:07:31 crc kubenswrapper[4785]: I1126 15:07:31.835686 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 26 15:07:33 crc kubenswrapper[4785]: I1126 15:07:33.022898 4785 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]log ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]etcd ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/openshift.io-startkubeinformers ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/openshift.io-api-request-count-filter ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/generic-apiserver-start-informers ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/priority-and-fairness-config-consumer ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/priority-and-fairness-filter ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-apiextensions-informers ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-apiextensions-controllers ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/crd-informer-synced ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-system-namespaces-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-cluster-authentication-info-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Nov 26 15:07:33 crc 
kubenswrapper[4785]: [+]poststarthook/start-legacy-token-tracking-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-service-ip-repair-controllers ok Nov 26 15:07:33 crc kubenswrapper[4785]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/priority-and-fairness-config-producer ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/bootstrap-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/start-kube-aggregator-informers ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-status-local-available-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-status-remote-available-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-registration-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-wait-for-first-sync ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-discovery-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/kube-apiserver-autoregistration ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]autoregister-completion ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-openapi-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: [+]poststarthook/apiservice-openapiv3-controller ok Nov 26 15:07:33 crc kubenswrapper[4785]: livez check failed Nov 26 15:07:33 crc kubenswrapper[4785]: I1126 15:07:33.022961 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:07:33 crc 
kubenswrapper[4785]: I1126 15:07:33.923805 4785 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.211333 4785 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.962015 4785 apiserver.go:52] "Watching apiserver" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.967902 4785 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.968165 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.968724 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.968820 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.968928 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.969235 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.969840 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:34 crc kubenswrapper[4785]: E1126 15:07:34.971315 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:34 crc kubenswrapper[4785]: E1126 15:07:34.972386 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.972510 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:34 crc kubenswrapper[4785]: E1126 15:07:34.972624 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.972929 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.972944 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.973149 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.973955 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.974143 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.974437 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.974441 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.977083 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 15:07:34 crc kubenswrapper[4785]: I1126 15:07:34.978207 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.016287 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.035694 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.052908 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.066937 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.067873 4785 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.079355 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.092629 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.103947 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.519245 4785 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.519322 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 15:07:35 crc kubenswrapper[4785]: I1126 15:07:35.575538 4785 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.035660 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.036223 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.836384 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.837897 4785 trace.go:236] Trace[1109002774]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Nov-2025 15:07:24.644) (total time: 12193ms): Nov 26 15:07:36 crc kubenswrapper[4785]: Trace[1109002774]: ---"Objects listed" error: 12193ms (15:07:36.837) Nov 26 15:07:36 crc kubenswrapper[4785]: Trace[1109002774]: [12.193375413s] [12.193375413s] END Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.838074 4785 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.839210 4785 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.839401 4785 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.867486 4785 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60944->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.867587 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60944->192.168.126.11:17697: read: connection reset by peer" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939815 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939850 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939871 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939890 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939905 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939923 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939945 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939959 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939973 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.939990 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940004 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940018 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940032 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940048 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940062 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940077 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940091 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940109 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940125 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940139 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940153 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940168 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" 
(UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940184 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940199 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940213 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940229 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940227 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940243 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940245 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940324 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940352 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940375 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940396 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940416 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940436 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940457 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940460 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940479 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940499 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940520 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940545 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940584 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940574 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940604 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940625 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940646 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940668 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940687 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940708 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940724 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940748 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940791 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940794 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940815 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940839 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940859 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 15:07:36 crc 
kubenswrapper[4785]: I1126 15:07:36.940881 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940892 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940902 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940925 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940946 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940967 4785 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940985 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.940988 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941005 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941025 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941045 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941065 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941075 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941087 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941110 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941133 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941153 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941167 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941175 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941196 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941209 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941216 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941237 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941258 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941279 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941306 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941329 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941354 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941327 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941367 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941389 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941413 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941433 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941453 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941473 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941493 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941516 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941581 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941586 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941650 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941675 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941697 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941727 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941741 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941750 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941755 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941771 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941793 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941817 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941841 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941863 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941894 4785 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941896 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941939 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941964 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.941986 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942007 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942018 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942028 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942073 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942101 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942126 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" 
(UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942138 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942151 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942182 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942207 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942230 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 
15:07:36.942254 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942274 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942276 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942296 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942364 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942392 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942412 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942433 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942449 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942466 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942483 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942503 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942521 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942536 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 
15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942574 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942673 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942723 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942740 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942757 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942775 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942791 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942808 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942824 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942841 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942858 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 26 15:07:36 crc 
kubenswrapper[4785]: I1126 15:07:36.942876 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942892 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942908 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942927 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942945 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942962 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942979 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942996 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943014 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943030 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943045 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 15:07:36 crc 
kubenswrapper[4785]: I1126 15:07:36.943061 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943079 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943098 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943116 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943133 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943150 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943167 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943184 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943199 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943215 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943230 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943246 4785 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943265 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943342 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943370 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943387 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943405 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943421 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943444 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943462 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943478 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943499 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 15:07:36 crc 
kubenswrapper[4785]: I1126 15:07:36.943571 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943597 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943641 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943760 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943788 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943814 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943848 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943878 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943903 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943936 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943959 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943982 4785 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944004 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944032 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944067 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944087 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944104 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944122 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944140 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944158 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944193 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944218 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944250 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944324 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944343 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944361 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944378 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944394 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944411 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944442 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944459 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944477 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944495 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944513 4785 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944533 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944575 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944635 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944667 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944702 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944747 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944771 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944795 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944820 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944846 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944871 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944907 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944937 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944964 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944987 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945009 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945086 4785 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945112 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945127 4785 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945141 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node 
\"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945155 4785 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945169 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945184 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945199 4785 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945213 4785 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945226 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945239 4785 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945253 4785 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945268 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945282 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945297 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945309 4785 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945321 4785 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945334 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945348 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945363 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945377 4785 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945391 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942356 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942359 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942421 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942492 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942542 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942546 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942589 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942608 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942690 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942695 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942708 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942715 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942777 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942843 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942870 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942877 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.947815 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942922 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942939 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.942970 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943005 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943050 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943084 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943182 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943219 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943417 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943536 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943615 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943742 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943743 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943812 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943911 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.943960 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944017 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944031 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944061 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944071 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944084 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944109 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944239 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944246 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944487 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.944597 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945279 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945539 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.945574 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.946073 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.947881 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.947903 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.947968 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.948100 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.948864 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949206 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949345 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949361 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949537 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949664 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949755 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949867 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949932 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.949977 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.950145 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.951258 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.951361 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.951679 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.951768 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.952406 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.952906 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.953179 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.953265 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.953430 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.953500 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.953736 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.954345 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.954406 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.954412 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.954767 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.954318 4785 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955154 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955250 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955305 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955340 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955315 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.951729 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955573 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955772 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955859 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956037 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956203 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956429 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956876 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956506 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956454 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.956922 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.957146 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.957293 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.957366 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.957671 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.957886 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.957986 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.958319 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.958645 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.958768 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959079 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959278 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.959301 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959386 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959435 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959472 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959780 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.959884 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.960020 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.960127 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.960287 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.951892 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:07:37.451865791 +0000 UTC m=+21.130231565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.960465 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.960638 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-26 15:07:37.460494404 +0000 UTC m=+21.138860168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.960729 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:37.46071672 +0000 UTC m=+21.139082544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.960755 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.960807 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.960959 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961036 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961063 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961215 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961260 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961301 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961457 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961705 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961502 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961677 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.955830 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961683 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.961910 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.962407 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.962014 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.963008 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.964113 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.964422 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.965229 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.965464 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.965475 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.965726 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.966058 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.974015 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.975146 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.975678 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.975689 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.975938 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.975940 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.976167 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.976215 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976201 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976264 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.976231 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.976357 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:37.476337164 +0000 UTC m=+21.154702938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976392 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976458 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976579 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976769 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.976867 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.978140 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.978156 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.978197 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.978211 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:36 crc kubenswrapper[4785]: E1126 15:07:36.978268 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:37.478246013 +0000 UTC m=+21.156611867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979065 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979180 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979394 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979713 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979752 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979797 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.979918 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.980143 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.980152 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.980241 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.980257 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.980495 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.981151 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.981591 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.982233 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.982794 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.983427 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.985091 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.985132 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.985427 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.985573 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.992901 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.993817 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.993825 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:07:36 crc kubenswrapper[4785]: I1126 15:07:36.994446 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.007781 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.014270 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.017693 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.020018 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.035524 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.035572 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.036032 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.036127 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.039688 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.040166 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.041330 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.041923 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.043698 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.044808 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.045091 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046177 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046238 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046302 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046306 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046346 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046370 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046381 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046390 4785 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046399 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046408 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046416 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046425 4785 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046433 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046442 4785 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046451 4785 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046460 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046467 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046477 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046486 4785 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046494 4785 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046502 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046510 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046518 4785 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046526 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046534 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046542 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046572 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046585 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046596 4785 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046605 4785 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" 
DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046614 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046622 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046630 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046639 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046648 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046659 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046673 4785 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc 
kubenswrapper[4785]: I1126 15:07:37.046685 4785 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046707 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046719 4785 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046731 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046742 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046752 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046763 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046775 4785 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046785 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046796 4785 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046809 4785 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046820 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046832 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046844 4785 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046855 4785 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046867 4785 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046876 4785 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046885 4785 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046894 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046902 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046910 4785 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046919 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath 
\"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046927 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046935 4785 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046943 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046951 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046960 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046968 4785 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046976 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046984 4785 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.046993 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047004 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047015 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047026 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047026 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047037 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047127 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047136 4785 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047165 4785 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047173 4785 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047181 4785 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047191 4785 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047199 4785 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047207 4785 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on 
node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047215 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047223 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047231 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047238 4785 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047246 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047254 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047263 4785 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047272 4785 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047284 4785 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047291 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047300 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047308 4785 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047317 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047326 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047334 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047342 4785 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047351 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047358 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047366 4785 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047373 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047381 4785 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047388 4785 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 
crc kubenswrapper[4785]: I1126 15:07:37.047397 4785 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047404 4785 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047412 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047419 4785 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047428 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047435 4785 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047443 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047477 4785 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047485 4785 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047493 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047501 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047509 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047517 4785 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047525 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047533 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047541 4785 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047564 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047575 4785 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047586 4785 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047596 4785 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047606 4785 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047614 4785 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 
crc kubenswrapper[4785]: I1126 15:07:37.047623 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047631 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047639 4785 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047648 4785 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047655 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047663 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047671 4785 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 
15:07:37.047678 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047686 4785 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047694 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047702 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047710 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047718 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047726 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047736 4785 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047743 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047752 4785 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047761 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047768 4785 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047811 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047820 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047828 4785 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047836 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047844 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047852 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047861 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047870 4785 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047878 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047887 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047895 4785 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047903 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047912 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047920 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047930 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047939 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047946 4785 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 
15:07:37.047954 4785 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047962 4785 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047969 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047978 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047985 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.047992 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048000 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048007 4785 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048015 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048024 4785 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048032 4785 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048039 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048047 4785 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048055 4785 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048062 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" 
Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048070 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.048118 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.049669 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.050341 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.051444 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.053393 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.054119 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.055383 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.055951 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.056836 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.057478 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.057876 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.058798 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.059348 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.059871 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.060357 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.060898 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.061338 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.062388 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.062897 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.063920 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.064646 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.065474 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.066053 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.066842 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.067277 4785 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.067375 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.069096 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.070019 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.070394 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.070792 4785 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.072088 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.073080 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.073592 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.074506 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.075130 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.076036 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.076618 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.077578 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.078128 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.078929 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.079422 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.079539 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.080384 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.081280 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.082168 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.082654 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.083648 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.084320 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.084880 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.085781 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.090052 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.092202 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.098994 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.105501 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 26 15:07:37 crc kubenswrapper[4785]: W1126 15:07:37.117804 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e5ecfe9d10949b9d51a0382606ba59e4cf1a3451e5ebf2d6bb4bc99e6486f61f WatchSource:0}: Error finding container e5ecfe9d10949b9d51a0382606ba59e4cf1a3451e5ebf2d6bb4bc99e6486f61f: Status 404 returned error can't find the container with id e5ecfe9d10949b9d51a0382606ba59e4cf1a3451e5ebf2d6bb4bc99e6486f61f Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.119613 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 26 15:07:37 crc kubenswrapper[4785]: W1126 15:07:37.129993 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8aa4b41b148f7a8b2798a9b38cbd7dadf96cccdd8eee30eae3e1bcbe08eeb544 WatchSource:0}: Error finding container 8aa4b41b148f7a8b2798a9b38cbd7dadf96cccdd8eee30eae3e1bcbe08eeb544: Status 404 returned error can't find the container with id 8aa4b41b148f7a8b2798a9b38cbd7dadf96cccdd8eee30eae3e1bcbe08eeb544 Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.161252 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8aa4b41b148f7a8b2798a9b38cbd7dadf96cccdd8eee30eae3e1bcbe08eeb544"} Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.162275 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e5ecfe9d10949b9d51a0382606ba59e4cf1a3451e5ebf2d6bb4bc99e6486f61f"} Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.164259 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"79655f84f7eb0204c53fdaef8d1a95534eea826a31bf36ad49fb5b2cbc0705bf"} Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.170994 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.177332 4785 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede" exitCode=255 Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.177379 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede"} Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.191233 4785 scope.go:117] "RemoveContainer" containerID="6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.192129 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.194408 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.215926 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.227335 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.238032 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.250202 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.259341 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.551789 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.551898 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.551975 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:07:38.551945467 +0000 UTC m=+22.230311241 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552020 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552041 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552054 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552109 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:38.552092091 +0000 UTC m=+22.230457965 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.552114 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.552183 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:37 crc kubenswrapper[4785]: I1126 15:07:37.552220 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552259 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:37 crc 
kubenswrapper[4785]: E1126 15:07:37.552296 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:38.552286106 +0000 UTC m=+22.230652000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552336 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552365 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:38.552356778 +0000 UTC m=+22.230722542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552417 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552444 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552463 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:37 crc kubenswrapper[4785]: E1126 15:07:37.552532 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:38.552510982 +0000 UTC m=+22.230876786 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.021216 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.035613 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.035752 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.036734 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.049229 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.060462 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.077370 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.094793 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.113171 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.127865 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.181640 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad"} Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.181686 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17"} Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.183304 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f"} Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.184887 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.186454 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d"} Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.186685 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.191738 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.201599 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.215021 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.222400 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-smv28"] Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.222677 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.224199 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.225743 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.226244 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.230847 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.240971 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.254403 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.273074 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.288517 4785 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.309024 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.331108 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hk884"] Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.331381 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.333753 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.333770 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.334646 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.337579 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.343801 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.358631 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwfft\" (UniqueName: \"kubernetes.io/projected/9d77660b-5b10-4573-84ab-3dc318d4b4ce-kube-api-access-mwfft\") pod \"node-resolver-smv28\" (UID: \"9d77660b-5b10-4573-84ab-3dc318d4b4ce\") " pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.358689 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d77660b-5b10-4573-84ab-3dc318d4b4ce-hosts-file\") pod \"node-resolver-smv28\" (UID: \"9d77660b-5b10-4573-84ab-3dc318d4b4ce\") " pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.370475 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.408542 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.444126 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.460025 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d77660b-5b10-4573-84ab-3dc318d4b4ce-hosts-file\") pod \"node-resolver-smv28\" (UID: \"9d77660b-5b10-4573-84ab-3dc318d4b4ce\") " pod="openshift-dns/node-resolver-smv28" Nov 
26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.460073 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9a87c55-b930-4993-88b8-15902e000caa-host\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.460109 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwfft\" (UniqueName: \"kubernetes.io/projected/9d77660b-5b10-4573-84ab-3dc318d4b4ce-kube-api-access-mwfft\") pod \"node-resolver-smv28\" (UID: \"9d77660b-5b10-4573-84ab-3dc318d4b4ce\") " pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.460134 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9a87c55-b930-4993-88b8-15902e000caa-serviceca\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.460182 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d77660b-5b10-4573-84ab-3dc318d4b4ce-hosts-file\") pod \"node-resolver-smv28\" (UID: \"9d77660b-5b10-4573-84ab-3dc318d4b4ce\") " pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.460245 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9rk\" (UniqueName: \"kubernetes.io/projected/b9a87c55-b930-4993-88b8-15902e000caa-kube-api-access-9n9rk\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.473924 4785 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.480204 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwfft\" (UniqueName: \"kubernetes.io/projected/9d77660b-5b10-4573-84ab-3dc318d4b4ce-kube-api-access-mwfft\") pod \"node-resolver-smv28\" (UID: \"9d77660b-5b10-4573-84ab-3dc318d4b4ce\") " pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.493328 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.505799 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.515075 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.528577 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.532575 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-smv28" Nov 26 15:07:38 crc kubenswrapper[4785]: W1126 15:07:38.541873 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d77660b_5b10_4573_84ab_3dc318d4b4ce.slice/crio-ed6f381f0880d4270fa8020215a80ee18d351fb6fcddbc0ecb5543f018462e2a WatchSource:0}: Error finding container ed6f381f0880d4270fa8020215a80ee18d351fb6fcddbc0ecb5543f018462e2a: Status 404 returned error can't find the container with id ed6f381f0880d4270fa8020215a80ee18d351fb6fcddbc0ecb5543f018462e2a Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.548305 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.560481 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.560658 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:07:40.560632631 +0000 UTC m=+24.238998395 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.560777 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.560862 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9rk\" (UniqueName: \"kubernetes.io/projected/b9a87c55-b930-4993-88b8-15902e000caa-kube-api-access-9n9rk\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.560935 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.561007 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.561078 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9a87c55-b930-4993-88b8-15902e000caa-host\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.561129 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9a87c55-b930-4993-88b8-15902e000caa-host\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.560947 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561170 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561181 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561209 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2025-11-26 15:07:40.561201656 +0000 UTC m=+24.239567420 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.560979 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561240 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:40.561232837 +0000 UTC m=+24.239598601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561094 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561308 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-26 15:07:40.561288048 +0000 UTC m=+24.239653812 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561380 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561412 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561423 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:38 crc kubenswrapper[4785]: E1126 15:07:38.561488 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:40.561469023 +0000 UTC m=+24.239834787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.561573 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.561651 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9a87c55-b930-4993-88b8-15902e000caa-serviceca\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.562356 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9a87c55-b930-4993-88b8-15902e000caa-serviceca\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.563122 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.574141 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9rk\" (UniqueName: \"kubernetes.io/projected/b9a87c55-b930-4993-88b8-15902e000caa-kube-api-access-9n9rk\") pod \"node-ca-hk884\" (UID: \"b9a87c55-b930-4993-88b8-15902e000caa\") " pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.575508 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.590737 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.601965 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.620648 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.641060 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hk884" Nov 26 15:07:38 crc kubenswrapper[4785]: I1126 15:07:38.641092 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:38Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:38 crc kubenswrapper[4785]: W1126 15:07:38.650969 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9a87c55_b930_4993_88b8_15902e000caa.slice/crio-26c48a8078b815673db1c03ec0109b4f44e92ff6ebbc14ff463bd1abcc7cc994 WatchSource:0}: Error finding container 26c48a8078b815673db1c03ec0109b4f44e92ff6ebbc14ff463bd1abcc7cc994: Status 404 returned error can't find the container with id 26c48a8078b815673db1c03ec0109b4f44e92ff6ebbc14ff463bd1abcc7cc994 Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.035425 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.035450 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:39 crc kubenswrapper[4785]: E1126 15:07:39.035585 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:39 crc kubenswrapper[4785]: E1126 15:07:39.035712 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.101535 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xbz7b"] Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.102237 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.103768 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-gkxdl"] Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.104083 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.110569 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.110672 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.110968 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.111330 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.112354 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.113618 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.114618 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.114624 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.114768 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6q4xd"] Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.114944 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.115063 4785 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.118323 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.118441 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-925q9"] Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.118586 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.118590 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.119235 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.120256 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.122385 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.122385 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.124601 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.124867 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.124884 4785 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.125064 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.139441 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.151527 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.161244 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.171458 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.179064 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.189723 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.189973 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hk884" event={"ID":"b9a87c55-b930-4993-88b8-15902e000caa","Type":"ContainerStarted","Data":"018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa"} Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.190020 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hk884" event={"ID":"b9a87c55-b930-4993-88b8-15902e000caa","Type":"ContainerStarted","Data":"26c48a8078b815673db1c03ec0109b4f44e92ff6ebbc14ff463bd1abcc7cc994"} Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.191257 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-smv28" event={"ID":"9d77660b-5b10-4573-84ab-3dc318d4b4ce","Type":"ContainerStarted","Data":"a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4"} Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.191295 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-smv28" event={"ID":"9d77660b-5b10-4573-84ab-3dc318d4b4ce","Type":"ContainerStarted","Data":"ed6f381f0880d4270fa8020215a80ee18d351fb6fcddbc0ecb5543f018462e2a"} Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.200858 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.215446 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.230398 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.243937 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.254826 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266402 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-system-cni-dir\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266444 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-k8s-cni-cncf-io\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266464 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-cni-binary-copy\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266484 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-conf-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266500 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-daemon-config\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266520 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-rootfs\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266537 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-var-lib-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266571 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266586 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266638 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-os-release\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266701 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266813 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.266954 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzs8\" (UniqueName: \"kubernetes.io/projected/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-kube-api-access-fqzs8\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 
15:07:39.266986 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-node-log\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267007 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxb5\" (UniqueName: \"kubernetes.io/projected/862c58fd-3f79-4276-bd76-ce689d32cbd6-kube-api-access-dgxb5\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267025 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267041 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovn-node-metrics-cert\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267129 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-kubelet\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267157 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267234 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-netns\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267303 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-bin\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267352 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-etc-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267377 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-socket-dir-parent\") pod \"multus-6q4xd\" 
(UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267405 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-cni-bin\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267446 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-cni-multus\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267469 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zv2\" (UniqueName: \"kubernetes.io/projected/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-kube-api-access-v7zv2\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267492 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-env-overrides\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267514 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-netd\") pod \"ovnkube-node-925q9\" (UID: 
\"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267542 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-config\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267597 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-cnibin\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267625 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-proxy-tls\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267642 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-systemd\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267713 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267758 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-hostroot\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267784 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-systemd-units\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267808 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-multus-certs\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267877 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-ovn\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267905 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-netns\") pod 
\"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267933 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-system-cni-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.267964 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-kubelet\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268014 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-slash\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268036 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-log-socket\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268051 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-script-lib\") pod \"ovnkube-node-925q9\" (UID: 
\"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268067 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdb7p\" (UniqueName: \"kubernetes.io/projected/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-kube-api-access-gdb7p\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268108 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-cni-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268179 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-etc-kubernetes\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268270 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cnibin\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.268298 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-os-release\") pod 
\"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.270672 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.285716 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.296054 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.313458 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.328004 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.338794 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.350778 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.364829 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368719 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-kubelet\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " 
pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368761 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-slash\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368781 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-log-socket\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368802 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-script-lib\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368822 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-system-cni-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368840 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-cni-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368857 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-etc-kubernetes\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368862 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-kubelet\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368923 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-log-socket\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368994 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-slash\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369037 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-etc-kubernetes\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.368877 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cnibin\") 
pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369068 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-system-cni-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369085 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-os-release\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369102 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-cni-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369115 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cnibin\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369153 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdb7p\" (UniqueName: \"kubernetes.io/projected/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-kube-api-access-gdb7p\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: 
\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369238 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-system-cni-dir\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369329 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-k8s-cni-cncf-io\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369345 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-os-release\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369350 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-conf-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369377 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-conf-dir\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " 
pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369381 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-k8s-cni-cncf-io\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369420 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-system-cni-dir\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369423 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-daemon-config\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369475 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-rootfs\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369510 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-var-lib-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc 
kubenswrapper[4785]: I1126 15:07:39.369531 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-rootfs\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369593 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-var-lib-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369605 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369629 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369650 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369655 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-os-release\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369677 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-cni-binary-copy\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369695 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369719 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzs8\" (UniqueName: \"kubernetes.io/projected/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-kube-api-access-fqzs8\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369737 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-node-log\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369756 4785 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dgxb5\" (UniqueName: \"kubernetes.io/projected/862c58fd-3f79-4276-bd76-ce689d32cbd6-kube-api-access-dgxb5\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369780 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369801 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovn-node-metrics-cert\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369825 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-kubelet\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369846 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369867 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369892 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-script-lib\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369930 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-netns\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369899 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-netns\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.369979 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-bin\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370001 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" 
(UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-socket-dir-parent\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370022 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-cni-bin\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370083 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-etc-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370109 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-cni-multus\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370132 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zv2\" (UniqueName: \"kubernetes.io/projected/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-kube-api-access-v7zv2\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370153 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-env-overrides\") pod 
\"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370177 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-config\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370196 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-cnibin\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370218 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-proxy-tls\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370243 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-systemd\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370256 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-os-release\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc 
kubenswrapper[4785]: I1126 15:07:39.370268 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-netd\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370277 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-cni-bin\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370315 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-mcd-auth-proxy-config\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370351 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-hostroot\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370355 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-etc-openvswitch\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370375 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-systemd-units\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370395 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-var-lib-cni-multus\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370398 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-multus-certs\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370413 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cni-binary-copy\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370421 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-multus-certs\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370443 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-ovn\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370466 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-netns\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370479 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-cnibin\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370522 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-host-run-netns\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370892 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-cni-binary-copy\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371049 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " 
pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371101 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-kubelet\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371135 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371150 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-env-overrides\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371170 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371204 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-bin\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" 
Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371251 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-socket-dir-parent\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371477 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371731 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-node-log\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370243 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-multus-daemon-config\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371787 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-systemd\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371848 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-ovn\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371853 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-mcd-auth-proxy-config\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371879 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-hostroot\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371881 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-config\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.371908 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-systemd-units\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.370316 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-netd\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.374177 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-proxy-tls\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.375585 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovn-node-metrics-cert\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.387062 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zv2\" (UniqueName: \"kubernetes.io/projected/855bd894-cca9-4fe1-a0d5-8b72afe7c93a-kube-api-access-v7zv2\") pod \"multus-6q4xd\" (UID: \"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\") " pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.388042 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.389639 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxb5\" (UniqueName: \"kubernetes.io/projected/862c58fd-3f79-4276-bd76-ce689d32cbd6-kube-api-access-dgxb5\") pod \"ovnkube-node-925q9\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.394317 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzs8\" (UniqueName: \"kubernetes.io/projected/5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4-kube-api-access-fqzs8\") pod \"machine-config-daemon-gkxdl\" (UID: \"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\") " pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.399539 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdb7p\" (UniqueName: \"kubernetes.io/projected/84d83039-5d86-45bb-a5c1-ca5b94ed92c5-kube-api-access-gdb7p\") pod \"multus-additional-cni-plugins-xbz7b\" (UID: \"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\") " pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.402148 4785 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.413296 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.416748 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovn
kube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.419868 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:07:39 crc kubenswrapper[4785]: W1126 15:07:39.426148 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d83039_5d86_45bb_a5c1_ca5b94ed92c5.slice/crio-f5c7a9bc56ecc7657125a012ca87a5439c34a6272dd1a422c9a18a709403208e WatchSource:0}: Error finding container f5c7a9bc56ecc7657125a012ca87a5439c34a6272dd1a422c9a18a709403208e: Status 404 returned error can't find the container with id f5c7a9bc56ecc7657125a012ca87a5439c34a6272dd1a422c9a18a709403208e Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.426565 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6q4xd" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.432089 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:39 crc kubenswrapper[4785]: I1126 15:07:39.441338 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:39Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:39 crc kubenswrapper[4785]: W1126 15:07:39.443743 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod855bd894_cca9_4fe1_a0d5_8b72afe7c93a.slice/crio-0070e4e4788b899dcaffdfaad55baabb710366b64d711bc97a1124dbed731abe WatchSource:0}: Error finding container 0070e4e4788b899dcaffdfaad55baabb710366b64d711bc97a1124dbed731abe: Status 404 returned error can't find the container with id 0070e4e4788b899dcaffdfaad55baabb710366b64d711bc97a1124dbed731abe Nov 26 15:07:39 crc kubenswrapper[4785]: W1126 15:07:39.460716 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod862c58fd_3f79_4276_bd76_ce689d32cbd6.slice/crio-eb4ce55002c9c00ad0f8578d4ed01703cb91814179fe559da5b366bd016953b8 WatchSource:0}: Error finding container eb4ce55002c9c00ad0f8578d4ed01703cb91814179fe559da5b366bd016953b8: Status 404 returned error can't find the container with id 
eb4ce55002c9c00ad0f8578d4ed01703cb91814179fe559da5b366bd016953b8 Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.036167 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.036357 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.199923 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.202009 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" exitCode=0 Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.202080 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.202127 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"eb4ce55002c9c00ad0f8578d4ed01703cb91814179fe559da5b366bd016953b8"} Nov 26 15:07:40 crc kubenswrapper[4785]: 
I1126 15:07:40.204054 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerStarted","Data":"041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.204118 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerStarted","Data":"0070e4e4788b899dcaffdfaad55baabb710366b64d711bc97a1124dbed731abe"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.205759 4785 generic.go:334] "Generic (PLEG): container finished" podID="84d83039-5d86-45bb-a5c1-ca5b94ed92c5" containerID="6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123" exitCode=0 Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.205825 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerDied","Data":"6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.205861 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerStarted","Data":"f5c7a9bc56ecc7657125a012ca87a5439c34a6272dd1a422c9a18a709403208e"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.208672 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.208734 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.208750 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"51f4895a15327db194635f4bd294c97f37456abd05686338fc339a60335a9359"} Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.216480 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.237600 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.250465 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.273006 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.289937 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.309399 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.319023 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07
:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.331138 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.342210 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.351699 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.363060 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.376513 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.389167 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.402768 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.415729 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.429332 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.444627 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.453395 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.469533 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.480463 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.493822 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.504882 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.517230 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.529366 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.544361 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.557384 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.583235 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.583326 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.583350 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.583381 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583439 4785 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583448 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:07:44.583417738 +0000 UTC m=+28.261783502 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583491 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:44.583474679 +0000 UTC m=+28.261840523 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.583540 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583621 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583642 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583623 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583668 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583682 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583690 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583723 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:44.583715285 +0000 UTC m=+28.262081049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583656 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583736 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:44.583729836 +0000 UTC m=+28.262095600 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 26 15:07:40 crc kubenswrapper[4785]: E1126 15:07:40.583777 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:44.583751546 +0000 UTC m=+28.262117310 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.744925 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.761334 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.762683 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.762915 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.773955 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.786037 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.799490 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.813495 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.830114 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.851965 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.864868 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.884913 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.903190 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.918352 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.930717 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.941045 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.954720 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.966870 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.977358 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:40 crc kubenswrapper[4785]: I1126 15:07:40.992118 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:40Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.006530 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.018312 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.037753 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:41 crc kubenswrapper[4785]: E1126 15:07:41.037990 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.038297 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:41 crc kubenswrapper[4785]: E1126 15:07:41.038457 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.039282 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.054004 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.068103 4785 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.094125 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.104162 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.118704 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.131762 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.139527 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.215403 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.215453 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.215472 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.215487 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.215501 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" 
event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.222505 4785 generic.go:334] "Generic (PLEG): container finished" podID="84d83039-5d86-45bb-a5c1-ca5b94ed92c5" containerID="538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb" exitCode=0 Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.222587 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerDied","Data":"538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb"} Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.236299 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.251957 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.267675 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.280653 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.293782 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.309890 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.322030 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.360314 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.383799 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.400023 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.421127 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.461021 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.490130 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d667
2b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:41 crc kubenswrapper[4785]: I1126 15:07:41.530383 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:41Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.035819 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:42 crc kubenswrapper[4785]: E1126 15:07:42.035997 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.230169 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.232419 4785 generic.go:334] "Generic (PLEG): container finished" podID="84d83039-5d86-45bb-a5c1-ca5b94ed92c5" containerID="d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d" exitCode=0 Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.232437 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerDied","Data":"d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d"} Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.257769 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.271646 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.283184 4785 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.296217 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.312000 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.328953 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.346674 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.362171 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.375864 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.389692 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.400123 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.415387 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.428278 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.439637 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.522409 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.527809 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:42 crc 
kubenswrapper[4785]: I1126 15:07:42.529717 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.535457 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.549333 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.562479 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.593381 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.610544 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.623667 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.636744 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.653477 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.665265 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.685160 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.700129 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.711339 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.729620 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.741524 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.754470 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.764694 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.791651 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.837681 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.876402 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.913214 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.954067 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:42 crc kubenswrapper[4785]: I1126 15:07:42.998854 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:42Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.036247 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.036377 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.036497 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.036700 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.051025 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf75471300081275
79df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.070087 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.110132 4785 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-co
ntroller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.156166 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.195431 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.228613 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d667
2b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.236969 4785 generic.go:334] "Generic (PLEG): container finished" podID="84d83039-5d86-45bb-a5c1-ca5b94ed92c5" containerID="35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea" exitCode=0 Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.237027 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" 
event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerDied","Data":"35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.239345 4785 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.240960 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.240989 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.240997 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.241087 4785 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.268306 4785 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.312074 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.324991 4785 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 
15:07:43.325256 4785 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.326098 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.326132 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.326141 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.326155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.326164 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.342528 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.346005 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.346029 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.346037 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.346053 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.346063 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.362865 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.369857 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.369898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.369907 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.369921 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.369932 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.374213 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.380435 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.385955 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.385998 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.386011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.386027 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.386039 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.398968 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.403909 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.403944 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.403955 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.403971 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.403983 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.410644 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.415002 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: E1126 15:07:43.415250 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.416752 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.416785 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.416794 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.416808 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.416817 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.453815 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.498797 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.519274 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.519321 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.519333 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.519351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.519385 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.530781 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.573888 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.613475 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.621353 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.621393 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.621408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.621425 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.621439 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.661962 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.697742 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.724330 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.724395 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.724415 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 
15:07:43.724440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.724457 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.739395 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.779436 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.816072 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.827070 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.827129 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.827146 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.827170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.827187 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.862172 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.900802 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.930232 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.930287 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.930304 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.930326 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.930343 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:43Z","lastTransitionTime":"2025-11-26T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:43 crc kubenswrapper[4785]: I1126 15:07:43.942519 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:43Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.032536 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.032645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.032671 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.032700 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.032722 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.036039 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.036159 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.135042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.135075 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.135084 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.135096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.135105 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.239158 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.239296 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.239310 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.239365 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.239380 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.251992 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.255138 4785 generic.go:334] "Generic (PLEG): container finished" podID="84d83039-5d86-45bb-a5c1-ca5b94ed92c5" containerID="b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20" exitCode=0 Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.255242 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerDied","Data":"b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.280925 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.298582 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.316087 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.332635 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.343147 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.343207 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.343221 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.343242 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.343256 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.348577 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.359061 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.379057 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.404797 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.417909 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.434184 4785 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.445713 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.445775 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.445791 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.445815 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.445833 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.452683 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.472989 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.497886 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.510463 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.535219 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:44Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.548011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc 
kubenswrapper[4785]: I1126 15:07:44.548056 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.548073 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.548093 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.548109 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.621865 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.622041 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.622113 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.622150 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.622199 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622302 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622400 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622301 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622410 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:52.622385111 +0000 UTC m=+36.300750905 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622466 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:52.622453623 +0000 UTC m=+36.300819427 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622466 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622514 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622537 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622430 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622618 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622656 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:52.622631068 +0000 UTC m=+36.300996862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622687 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:52.622674179 +0000 UTC m=+36.301039983 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:44 crc kubenswrapper[4785]: E1126 15:07:44.622712 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:07:52.622696999 +0000 UTC m=+36.301062793 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.651171 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.651260 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.651278 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.651304 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.651321 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.754628 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.754707 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.754725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.754749 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.754769 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.858351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.858419 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.858437 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.858460 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.858481 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.962892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.962946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.962965 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.962988 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:44 crc kubenswrapper[4785]: I1126 15:07:44.963006 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:44Z","lastTransitionTime":"2025-11-26T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.035767 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.035859 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:45 crc kubenswrapper[4785]: E1126 15:07:45.036058 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:45 crc kubenswrapper[4785]: E1126 15:07:45.036209 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.066226 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.066281 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.066299 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.066325 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.066343 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.180358 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.180405 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.180421 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.180444 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.180460 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.264715 4785 generic.go:334] "Generic (PLEG): container finished" podID="84d83039-5d86-45bb-a5c1-ca5b94ed92c5" containerID="20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93" exitCode=0 Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.264778 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerDied","Data":"20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.283683 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.288266 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.288580 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.288596 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.288617 4785 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.288633 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.307342 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.330399 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.345799 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.369168 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.390882 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.393199 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.393239 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.393252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.393269 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.393282 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.412037 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.435544 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.454048 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.471067 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.491288 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf75471300081275
79df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.496120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.496149 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.496157 4785 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.496170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.496181 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.508914 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.524291 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.559343 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.585925 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.598485 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.598529 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.598541 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.598576 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.598587 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.702154 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.702215 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.702228 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.702248 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.702261 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.805801 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.805880 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.805903 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.805933 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.805955 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.909023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.909154 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.909284 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.909319 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:45 crc kubenswrapper[4785]: I1126 15:07:45.909379 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:45Z","lastTransitionTime":"2025-11-26T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.014410 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.014538 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.014650 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.014743 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.014767 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.035826 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:46 crc kubenswrapper[4785]: E1126 15:07:46.036022 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.118042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.118094 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.118117 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.118147 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.118197 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.220938 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.221015 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.221041 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.221071 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.221092 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.274644 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.275033 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.284029 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" event={"ID":"84d83039-5d86-45bb-a5c1-ca5b94ed92c5","Type":"ContainerStarted","Data":"5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.298042 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.316914 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.323895 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.323949 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.323966 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.323997 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.324014 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.331131 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.348242 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manif
ests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b937
27ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"stat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.367892 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.390373 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.412372 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.428577 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.428636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.428659 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.428689 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.428709 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.438179 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.469451 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.481309 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.496816 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.509418 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.521068 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.530826 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.530918 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.530942 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.530968 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.530983 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.537296 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.551775 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.566515 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.587152 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.605185 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.617473 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.627925 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.633243 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.633284 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.633299 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.633367 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.633415 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.638246 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.648744 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.660860 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.670405 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: 
I1126 15:07:46.686981 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.704946 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.724624 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.733720 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.735120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.735168 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.735180 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.735194 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 
15:07:46.735204 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.744802 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.753743 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.762232 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:46Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.838439 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.838531 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.838544 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.838599 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.838624 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.940878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.940914 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.940925 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.940965 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:46 crc kubenswrapper[4785]: I1126 15:07:46.940978 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:46Z","lastTransitionTime":"2025-11-26T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.036085 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:47 crc kubenswrapper[4785]: E1126 15:07:47.036292 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.036423 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:47 crc kubenswrapper[4785]: E1126 15:07:47.036628 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.043262 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.043320 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.043332 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.043349 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.043364 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.059867 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.085109 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.114462 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.133600 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.149666 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.149704 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.149713 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.149725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.149733 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.185913 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.204114 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.225356 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.235343 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.244959 4785 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.252315 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.252337 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.252347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.252360 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.252368 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.255081 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.269711 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.280435 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee
49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.286927 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.286971 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.294675 4785 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.309483 4785 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.315418 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.319841 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.330755 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.339683 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.350863 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.354626 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.354691 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.354710 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.354734 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.354752 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.364710 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.375786 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.388223 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.401011 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.415595 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: 
I1126 15:07:47.430207 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.449334 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.457166 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.457204 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.457215 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.457234 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.457248 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.470061 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.491047 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.510396 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.521681 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.535007 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.559943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc 
kubenswrapper[4785]: I1126 15:07:47.560065 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.560133 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.560220 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.560285 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.663142 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.663173 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.663183 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.663198 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.663209 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.765499 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.765577 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.765597 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.765619 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.765636 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.868829 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.868886 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.868903 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.868927 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.868943 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.971391 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.971743 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.971826 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.971915 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:47 crc kubenswrapper[4785]: I1126 15:07:47.971982 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:47Z","lastTransitionTime":"2025-11-26T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.036412 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:48 crc kubenswrapper[4785]: E1126 15:07:48.036982 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.074928 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.074966 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.074978 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.074994 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.075004 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.177041 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.177078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.177088 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.177102 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.177112 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.280954 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.280996 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.281005 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.281020 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.281029 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.383528 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.383591 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.383600 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.383615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.383629 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.486700 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.486733 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.486745 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.486787 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.486804 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.589975 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.590005 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.590016 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.590032 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.590045 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.692593 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.692645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.692656 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.692675 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.692687 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.796258 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.796302 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.796313 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.796330 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.796342 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.898975 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.899004 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.899011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.899027 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:48 crc kubenswrapper[4785]: I1126 15:07:48.899036 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:48Z","lastTransitionTime":"2025-11-26T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.001142 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.001185 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.001196 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.001211 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.001224 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.035586 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.035618 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:49 crc kubenswrapper[4785]: E1126 15:07:49.035766 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:49 crc kubenswrapper[4785]: E1126 15:07:49.035884 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.102542 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.102610 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.102618 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.102633 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.102644 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.205059 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.205123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.205140 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.205163 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.205181 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.294508 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/0.log" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.300150 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a" exitCode=1 Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.300221 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.301153 4785 scope.go:117] "RemoveContainer" containerID="f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.314991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.315025 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.315033 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.315045 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.315054 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.317838 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoin
t\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.328712 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.348469 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.359481 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.375122 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.387036 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.399762 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.417520 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.417579 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.417592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.417604 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.417614 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.417980 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.432037 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.446252 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.459018 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.471681 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.485879 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: 
I1126 15:07:49.498147 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.513980 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:48Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1126 15:07:48.711996 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:07:48.712044 6090 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI1126 15:07:48.712060 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:07:48.712068 6090 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:07:48.712085 6090 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 15:07:48.712108 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1126 15:07:48.712143 6090 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:07:48.712149 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:07:48.712151 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:07:48.712154 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:07:48.712159 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1126 15:07:48.712166 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1126 15:07:48.712171 6090 factory.go:656] Stopping watch factory\\\\nI1126 15:07:48.712172 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:07:48.712183 6090 ovnkube.go:599] Stopped ovnkube\\\\nI1126 
15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:49Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.519582 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.519609 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.519619 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.519634 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.519644 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.621629 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.621868 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.621875 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.621887 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.621895 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.723677 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.723719 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.723729 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.723742 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.723750 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.826828 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.826878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.826892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.826909 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.826922 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.929105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.929148 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.929157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.929174 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:49 crc kubenswrapper[4785]: I1126 15:07:49.929186 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:49Z","lastTransitionTime":"2025-11-26T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.030942 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.031004 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.031022 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.031045 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.031061 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.036312 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:50 crc kubenswrapper[4785]: E1126 15:07:50.036629 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.134338 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.134396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.134410 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.134429 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.134751 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.238620 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.238710 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.238731 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.238760 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.238788 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.312798 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/1.log" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.313842 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/0.log" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.318366 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5" exitCode=1 Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.318424 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.318488 4785 scope.go:117] "RemoveContainer" containerID="f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.319706 4785 scope.go:117] "RemoveContainer" containerID="70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5" Nov 26 15:07:50 crc kubenswrapper[4785]: E1126 15:07:50.320071 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.341455 4785 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.341494 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.341513 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.341535 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.341570 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.343409 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118
b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.369833 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.386968 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.404589 4785 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.425662 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.444468 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.444508 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.444515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.444527 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.444536 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.449489 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.466708 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.496405 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:48Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1126 15:07:48.711996 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:07:48.712044 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:07:48.712060 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:07:48.712068 6090 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:07:48.712085 6090 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 15:07:48.712108 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1126 15:07:48.712143 6090 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:07:48.712149 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:07:48.712151 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:07:48.712154 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:07:48.712159 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1126 15:07:48.712166 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1126 15:07:48.712171 6090 factory.go:656] Stopping watch factory\\\\nI1126 15:07:48.712172 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:07:48.712183 6090 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] 
Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\
"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.508016 4785 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.533446 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.546968 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.547036 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.547053 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.547076 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.547093 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.552789 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.565936 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.584330 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.601407 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.621262 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:50Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.649795 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.649875 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.649901 4785 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.649933 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.649958 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.753510 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.753617 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.753636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.753660 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.753677 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.857281 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.857338 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.857354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.857378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.857396 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.960538 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.960638 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.960661 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.960689 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:50 crc kubenswrapper[4785]: I1126 15:07:50.960709 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:50Z","lastTransitionTime":"2025-11-26T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.035664 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.035742 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:51 crc kubenswrapper[4785]: E1126 15:07:51.035885 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:51 crc kubenswrapper[4785]: E1126 15:07:51.036100 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.063352 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.063384 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.063391 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.063404 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.063413 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.166488 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.166528 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.166540 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.166588 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.166601 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.250538 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt"] Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.251755 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.255379 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.255474 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.269748 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.269790 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.269801 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.269818 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.269834 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.275094 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.290134 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.294173 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09c65f70-203d-40eb-a45e-ed8d7e36912f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.294612 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09c65f70-203d-40eb-a45e-ed8d7e36912f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.294862 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg52p\" (UniqueName: \"kubernetes.io/projected/09c65f70-203d-40eb-a45e-ed8d7e36912f-kube-api-access-dg52p\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.295124 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09c65f70-203d-40eb-a45e-ed8d7e36912f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.303968 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.321812 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.323687 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/1.log" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.327284 4785 scope.go:117] "RemoveContainer" containerID="70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5" Nov 26 15:07:51 crc kubenswrapper[4785]: E1126 15:07:51.327493 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.334734 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.348658 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.370311 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.372995 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.373027 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.373039 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.373054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.373066 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.381985 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 
15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.392817 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.396276 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09c65f70-203d-40eb-a45e-ed8d7e36912f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.396352 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09c65f70-203d-40eb-a45e-ed8d7e36912f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.396378 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg52p\" (UniqueName: \"kubernetes.io/projected/09c65f70-203d-40eb-a45e-ed8d7e36912f-kube-api-access-dg52p\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.396401 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09c65f70-203d-40eb-a45e-ed8d7e36912f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.397147 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09c65f70-203d-40eb-a45e-ed8d7e36912f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.397961 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09c65f70-203d-40eb-a45e-ed8d7e36912f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.403900 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09c65f70-203d-40eb-a45e-ed8d7e36912f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.404474 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.415665 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg52p\" (UniqueName: \"kubernetes.io/projected/09c65f70-203d-40eb-a45e-ed8d7e36912f-kube-api-access-dg52p\") pod \"ovnkube-control-plane-749d76644c-zk6jt\" (UID: \"09c65f70-203d-40eb-a45e-ed8d7e36912f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.426081 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.437516 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: 
I1126 15:07:51.452493 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.475461 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.475507 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.475518 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.475533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.475544 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.480202 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f40be3fafbe55a70dd7b23d6fa3fe8267f607d03f065543745f6c93773f1663a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:48Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:117\\\\nI1126 15:07:48.711996 6090 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:07:48.712044 6090 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:07:48.712060 6090 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:07:48.712068 6090 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:07:48.712085 6090 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1126 15:07:48.712108 6090 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1126 15:07:48.712143 6090 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:07:48.712149 6090 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:07:48.712151 6090 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:07:48.712154 6090 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:07:48.712159 6090 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1126 15:07:48.712166 6090 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1126 15:07:48.712171 6090 factory.go:656] Stopping watch factory\\\\nI1126 15:07:48.712172 6090 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:07:48.712183 6090 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] 
Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\
"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.500367 4785 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.510187 4785 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.539482 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],
\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.554580 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.568143 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.572719 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.578520 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.578616 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.578640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.578667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.578689 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.580876 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: W1126 15:07:51.587083 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c65f70_203d_40eb_a45e_ed8d7e36912f.slice/crio-4d9be3d73422b327df6a435506580f986d5a3cca0f3074708772fbc55e937497 WatchSource:0}: Error finding container 4d9be3d73422b327df6a435506580f986d5a3cca0f3074708772fbc55e937497: Status 404 returned error can't find the container with id 4d9be3d73422b327df6a435506580f986d5a3cca0f3074708772fbc55e937497 Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.604848 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.617577 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: 
I1126 15:07:51.631582 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.652147 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.664343 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.675291 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.681610 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.681663 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.681678 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.681699 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.681716 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.689929 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.699936 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.713891 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.728254 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.743669 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.759754 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:51Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.784378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.784411 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.784422 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.784440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.784453 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.887157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.887198 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.887209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.887225 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.887365 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.992139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.992208 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.992257 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.992288 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:51 crc kubenswrapper[4785]: I1126 15:07:51.992311 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:51Z","lastTransitionTime":"2025-11-26T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.036070 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.036243 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.094664 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.094705 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.094714 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.094729 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.094738 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.197379 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.197465 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.197478 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.197495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.197508 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.300140 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.300200 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.300223 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.300247 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.300262 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.332522 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" event={"ID":"09c65f70-203d-40eb-a45e-ed8d7e36912f","Type":"ContainerStarted","Data":"ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.332634 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" event={"ID":"09c65f70-203d-40eb-a45e-ed8d7e36912f","Type":"ContainerStarted","Data":"69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.332658 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" event={"ID":"09c65f70-203d-40eb-a45e-ed8d7e36912f","Type":"ContainerStarted","Data":"4d9be3d73422b327df6a435506580f986d5a3cca0f3074708772fbc55e937497"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.351162 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.364494 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.378834 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.396944 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.403013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.403087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.403112 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.403144 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.403168 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.416971 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.440094 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.471340 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.495599 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.506183 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.506245 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.506262 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.506288 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.506306 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.520645 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.541182 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.558122 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.571070 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: 
I1126 15:07:52.590656 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.608857 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.608934 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.608948 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.608966 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.608979 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.621016 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.638652 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.654179 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.707366 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.707719 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:08:08.707681838 +0000 UTC m=+52.386047652 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.707810 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.707909 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.708037 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708090 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.708116 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708134 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708158 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708199 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708090 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708235 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:08.708214032 +0000 UTC m=+52.386580046 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708347 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708370 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708389 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708441 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:08.708352615 +0000 UTC m=+52.386718419 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708477 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:08.708460388 +0000 UTC m=+52.386826192 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.708533 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:08.708501329 +0000 UTC m=+52.386867123 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.712614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.712683 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.712697 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.712722 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.712735 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.734997 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qdfwp"] Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.735743 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.735845 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.747774 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.760111 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.770014 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc 
kubenswrapper[4785]: I1126 15:07:52.787375 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.801239 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.809310 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrprh\" (UniqueName: \"kubernetes.io/projected/72903df2-b694-4229-96b5-167500cab723-kube-api-access-wrprh\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.809391 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.816412 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.816485 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.816507 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.816540 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 
crc kubenswrapper[4785]: I1126 15:07:52.816593 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.820417 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6
6438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.837809 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.856392 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.856509 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.880240 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.900204 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.910719 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrprh\" (UniqueName: \"kubernetes.io/projected/72903df2-b694-4229-96b5-167500cab723-kube-api-access-wrprh\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.910773 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.910897 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: E1126 15:07:52.910946 4785 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:53.41093164 +0000 UTC m=+37.089297404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.919673 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.919743 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.919755 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.919776 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.919795 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:52Z","lastTransitionTime":"2025-11-26T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.928264 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.939649 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrprh\" (UniqueName: \"kubernetes.io/projected/72903df2-b694-4229-96b5-167500cab723-kube-api-access-wrprh\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.950128 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.966515 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:52 crc kubenswrapper[4785]: I1126 15:07:52.985239 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:52Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.012106 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.022426 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.022465 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.022475 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.022492 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.022503 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.030046 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.035910 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.035910 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.036162 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.036264 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.062748 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.099974 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.119916 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.130058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.130103 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.130116 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.130137 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.130153 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.151175 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:
07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.165047 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.178440 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d
6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.194277 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.211854 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.227149 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c
72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.233386 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.233469 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.233495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.233527 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.233581 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.245138 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.270402 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.295120 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.313925 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.328908 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.336167 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.336228 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.336246 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.336273 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.336290 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.349799 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.363669 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.379828 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.409499 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.416457 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.416776 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.416904 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. 
No retries permitted until 2025-11-26 15:07:54.416876733 +0000 UTC m=+38.095242527 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.440713 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.440967 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.441059 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.441166 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.441265 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.544723 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.544805 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.544821 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.544838 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.544850 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.550895 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.550943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.550963 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.550982 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.550997 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.566301 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.570870 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.571104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.571212 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.571325 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.571581 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.589071 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.593370 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.593403 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.593414 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.593432 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.593448 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.610367 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.616119 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.616302 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.616368 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.616455 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.616525 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.632742 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.638265 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.638318 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.638331 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.638352 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.638365 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.657598 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:53Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:53 crc kubenswrapper[4785]: E1126 15:07:53.658101 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.660344 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.660450 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.660519 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.660612 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.660687 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.764774 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.764821 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.764838 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.764861 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.764878 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.867855 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.868188 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.868199 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.868216 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.868231 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.971131 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.971248 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.971267 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.971293 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:53 crc kubenswrapper[4785]: I1126 15:07:53.971313 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:53Z","lastTransitionTime":"2025-11-26T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.036194 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.036215 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:54 crc kubenswrapper[4785]: E1126 15:07:54.036398 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:07:54 crc kubenswrapper[4785]: E1126 15:07:54.036605 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.074515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.074625 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.074642 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.074700 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.074718 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.179667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.179804 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.179880 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.179915 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.179992 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.283380 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.283418 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.283426 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.283440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.283452 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.386928 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.386972 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.386983 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.387001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.387013 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.446483 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:54 crc kubenswrapper[4785]: E1126 15:07:54.446661 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:54 crc kubenswrapper[4785]: E1126 15:07:54.446720 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:07:56.446702253 +0000 UTC m=+40.125068027 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.489768 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.489821 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.489834 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.489856 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.489869 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.593419 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.593459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.593469 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.593483 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.593493 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.697079 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.697148 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.697167 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.697192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.697210 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.800207 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.800279 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.800301 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.800332 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.800352 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.903619 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.903698 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.903719 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.903744 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:54 crc kubenswrapper[4785]: I1126 15:07:54.903765 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:54Z","lastTransitionTime":"2025-11-26T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.007484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.007609 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.007638 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.007666 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.007688 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.036056 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:55 crc kubenswrapper[4785]: E1126 15:07:55.036274 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.036056 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:55 crc kubenswrapper[4785]: E1126 15:07:55.036916 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.116051 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.116105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.116121 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.116143 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.116160 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.219386 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.219449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.219465 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.219488 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.219506 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.322948 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.323051 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.323079 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.323151 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.323181 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.426335 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.426402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.426420 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.426444 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.426462 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.529845 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.529963 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.529990 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.530024 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.530047 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.632764 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.632843 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.632871 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.632902 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.632927 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.736093 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.736155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.736181 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.736206 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.736224 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.839930 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.840005 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.840025 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.840050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.840077 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.942093 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.942189 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.942206 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.942227 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:55 crc kubenswrapper[4785]: I1126 15:07:55.942245 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:55Z","lastTransitionTime":"2025-11-26T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.036130 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.036130 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:56 crc kubenswrapper[4785]: E1126 15:07:56.036357 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:07:56 crc kubenswrapper[4785]: E1126 15:07:56.036515 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.044338 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.044387 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.044400 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.044423 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.044443 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.147181 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.147241 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.147263 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.147294 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.147315 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.250011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.250076 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.250100 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.250128 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.250148 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.352255 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.352316 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.352333 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.352357 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.352374 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.454963 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.455069 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.455087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.455166 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.455186 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.465745 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:56 crc kubenswrapper[4785]: E1126 15:07:56.465907 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:56 crc kubenswrapper[4785]: E1126 15:07:56.466008 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:00.465979869 +0000 UTC m=+44.144345673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.558810 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.558857 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.558866 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.558882 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.558892 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.661951 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.662012 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.662029 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.662054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.662072 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.764424 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.764472 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.764483 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.764499 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.764512 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.867004 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.867112 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.867129 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.867192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.867211 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.970023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.970079 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.970095 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.970116 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:56 crc kubenswrapper[4785]: I1126 15:07:56.970133 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:56Z","lastTransitionTime":"2025-11-26T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.036094 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:57 crc kubenswrapper[4785]: E1126 15:07:57.036285 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.036333 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:57 crc kubenswrapper[4785]: E1126 15:07:57.036477 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.049353 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.062256 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.072717 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.072853 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.072886 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.072915 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.072937 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.077365 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.092146 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T
15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e
86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.105737 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.126056 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.139871 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.169079 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.175864 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.175927 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.175949 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.175978 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.175999 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.189889 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.207844 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.225218 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.251072 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.267739 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: 
I1126 15:07:57.283279 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.283343 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.283359 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.283382 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.283401 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.296053 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.312330 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.329823 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.343876 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:07:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:07:57 crc 
kubenswrapper[4785]: I1126 15:07:57.385695 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.385739 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.385750 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.385764 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.385774 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.488432 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.488464 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.488472 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.488486 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.488496 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.591281 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.591340 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.591358 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.591383 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.591401 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.694966 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.695019 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.695038 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.695062 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.695081 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.798446 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.798510 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.798527 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.798608 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.798632 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.901209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.901303 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.901322 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.901345 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:57 crc kubenswrapper[4785]: I1126 15:07:57.901361 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:57Z","lastTransitionTime":"2025-11-26T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.004419 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.004484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.004503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.004529 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.004546 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.036001 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:07:58 crc kubenswrapper[4785]: E1126 15:07:58.036376 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.036078 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:07:58 crc kubenswrapper[4785]: E1126 15:07:58.036666 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.107306 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.107660 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.107769 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.107871 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.107969 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.211096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.211122 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.211130 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.211142 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.211151 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.314184 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.314237 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.314257 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.314285 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.314305 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.417629 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.417701 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.417723 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.417752 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.417774 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.520594 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.520659 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.520679 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.520703 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.520721 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.625445 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.625481 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.625490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.625502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.625511 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.728647 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.728713 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.728734 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.728760 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.728777 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.831665 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.831721 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.831739 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.831765 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.831783 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.934946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.935001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.935018 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.935050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:58 crc kubenswrapper[4785]: I1126 15:07:58.935068 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:58Z","lastTransitionTime":"2025-11-26T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.035594 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.035664 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:07:59 crc kubenswrapper[4785]: E1126 15:07:59.035820 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:07:59 crc kubenswrapper[4785]: E1126 15:07:59.036024 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.037985 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.038050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.038074 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.038131 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.038156 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.141028 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.141094 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.141117 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.141145 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.141167 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.244312 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.244396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.244414 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.244441 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.244460 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.353888 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.353925 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.353934 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.353947 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.353958 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.456397 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.456464 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.456486 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.456516 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.456539 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.559328 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.559402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.559415 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.559436 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.559449 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.662493 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.662615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.662640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.662675 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.662697 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.771811 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.771872 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.771889 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.771913 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.771930 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.876031 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.876211 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.876228 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.876252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.876271 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.979007 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.979069 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.979087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.979114 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:07:59 crc kubenswrapper[4785]: I1126 15:07:59.979131 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:07:59Z","lastTransitionTime":"2025-11-26T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.036184 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.036184 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:00 crc kubenswrapper[4785]: E1126 15:08:00.036496 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:00 crc kubenswrapper[4785]: E1126 15:08:00.036745 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.082046 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.082109 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.082127 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.082150 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.082167 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.185779 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.185858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.185880 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.185912 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.185936 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.289799 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.289875 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.289892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.289918 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.289936 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.392105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.392145 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.392156 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.392171 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.392182 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.496243 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.496319 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.496342 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.496371 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.496390 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.499965 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:00 crc kubenswrapper[4785]: E1126 15:08:00.500114 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:00 crc kubenswrapper[4785]: E1126 15:08:00.500194 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:08.500172489 +0000 UTC m=+52.178538263 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.600042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.600095 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.600120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.600152 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.600175 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.702823 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.702890 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.702915 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.702946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.702967 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.805083 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.805142 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.805164 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.805193 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.805207 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.907818 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.907867 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.907879 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.907893 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:00 crc kubenswrapper[4785]: I1126 15:08:00.907903 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:00Z","lastTransitionTime":"2025-11-26T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.011339 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.011397 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.011412 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.011433 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.011445 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.035834 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.035870 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:01 crc kubenswrapper[4785]: E1126 15:08:01.036016 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:01 crc kubenswrapper[4785]: E1126 15:08:01.036105 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.114379 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.114452 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.114475 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.114502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.114530 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.216858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.216913 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.216931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.216953 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.216970 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.320074 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.320159 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.320184 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.320209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.320228 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.422612 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.422686 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.422709 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.422737 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.422759 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.526097 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.526138 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.526146 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.526162 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.526172 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.628935 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.628991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.629008 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.629031 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.629049 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.731074 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.731104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.731115 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.731131 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.731146 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.833931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.833968 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.833980 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.833995 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.834006 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.936937 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.936992 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.937010 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.937034 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:01 crc kubenswrapper[4785]: I1126 15:08:01.937052 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:01Z","lastTransitionTime":"2025-11-26T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.035957 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.036065 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:02 crc kubenswrapper[4785]: E1126 15:08:02.036123 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:02 crc kubenswrapper[4785]: E1126 15:08:02.036254 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.040904 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.040975 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.041001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.041035 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.041059 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.144245 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.144301 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.144323 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.144347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.144365 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.247249 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.247312 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.247331 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.247354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.247372 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.350991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.351063 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.351088 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.351115 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.351141 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.461102 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.461220 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.461249 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.461283 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.461317 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.564482 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.564542 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.564592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.564616 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.564633 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.667529 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.667647 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.667665 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.667685 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.667698 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.769836 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.769892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.769904 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.769920 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.769932 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.873070 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.873197 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.873219 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.873540 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.873761 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.977378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.977466 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.977487 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.977509 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:02 crc kubenswrapper[4785]: I1126 15:08:02.977525 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:02Z","lastTransitionTime":"2025-11-26T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.035930 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.035939 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:03 crc kubenswrapper[4785]: E1126 15:08:03.036154 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:03 crc kubenswrapper[4785]: E1126 15:08:03.036282 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.079925 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.079996 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.080018 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.080045 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.080068 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.183796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.183848 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.183865 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.183887 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.183906 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.286776 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.286835 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.286852 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.286876 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.286896 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.389661 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.389719 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.389744 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.389770 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.389791 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.492479 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.492541 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.492578 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.492600 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.492610 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.595805 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.595851 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.595865 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.595880 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.595893 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.700047 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.700116 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.700133 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.700159 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.700178 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.803065 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.803139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.803160 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.803192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.803228 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.891496 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.891601 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.891626 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.891654 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.891684 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: E1126 15:08:03.913042 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:03Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.918204 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.918271 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.918294 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.918324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.918347 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: E1126 15:08:03.938754 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:03Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.944196 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.944348 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.944373 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.944395 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.944412 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: E1126 15:08:03.965946 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:03Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.971083 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.971139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.971157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.971183 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.971203 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:03 crc kubenswrapper[4785]: E1126 15:08:03.992016 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:03Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.997744 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.997828 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.997848 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.997875 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:03 crc kubenswrapper[4785]: I1126 15:08:03.997894 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:03Z","lastTransitionTime":"2025-11-26T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: E1126 15:08:04.017983 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:04Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:04 crc kubenswrapper[4785]: E1126 15:08:04.018149 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.023086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.023167 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.023189 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.023217 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.023244 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.035866 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.035887 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:04 crc kubenswrapper[4785]: E1126 15:08:04.036118 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:04 crc kubenswrapper[4785]: E1126 15:08:04.036195 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.127459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.127524 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.127541 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.127591 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.127611 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.229974 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.230061 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.230086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.230117 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.230139 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.333452 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.333536 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.333587 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.333614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.333632 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.436133 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.436192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.436209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.436237 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.436260 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.538757 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.538818 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.538838 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.538862 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.538881 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.642402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.642481 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.642504 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.642533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.642593 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.745430 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.745542 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.745582 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.745608 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.745628 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.848105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.848184 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.848206 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.848238 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.848260 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.950601 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.950676 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.950698 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.950731 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:04 crc kubenswrapper[4785]: I1126 15:08:04.950755 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:04Z","lastTransitionTime":"2025-11-26T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.035731 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.035757 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:05 crc kubenswrapper[4785]: E1126 15:08:05.036030 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:05 crc kubenswrapper[4785]: E1126 15:08:05.036352 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.038020 4785 scope.go:117] "RemoveContainer" containerID="70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.054529 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.054868 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.055329 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.055784 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.056217 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.160207 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.160246 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.160255 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.160268 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.160281 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.262885 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.263258 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.263272 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.263292 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.263306 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.365518 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.365616 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.365647 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.365676 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.365698 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.380000 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/1.log" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.383216 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.383711 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.396281 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee
7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.410087 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.424530 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.447366 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.468218 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.468285 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.468302 4785 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.468328 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.468347 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.474512 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.495379 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.519836 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.537020 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.554223 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.570717 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.571115 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.571704 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.571743 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.571782 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.571804 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.588214 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.599774 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.618289 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.636769 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, 
clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.645919 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.657422 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.666895 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:05Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:05 crc 
kubenswrapper[4785]: I1126 15:08:05.674681 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.674715 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.674726 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.674740 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.674752 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.777025 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.777064 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.777075 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.777089 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.777099 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.891028 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.891068 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.891078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.891094 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.891105 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.993456 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.993484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.993492 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.993503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:05 crc kubenswrapper[4785]: I1126 15:08:05.993512 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:05Z","lastTransitionTime":"2025-11-26T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.035476 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.035521 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:06 crc kubenswrapper[4785]: E1126 15:08:06.035614 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:06 crc kubenswrapper[4785]: E1126 15:08:06.035711 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.096024 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.096055 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.096064 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.096081 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.096094 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.199168 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.199231 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.199247 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.199271 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.199289 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.302238 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.302299 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.302309 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.302324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.302334 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.389775 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/2.log" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.390522 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/1.log" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.393805 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8" exitCode=1 Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.393848 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.393889 4785 scope.go:117] "RemoveContainer" containerID="70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.394527 4785 scope.go:117] "RemoveContainer" containerID="e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8" Nov 26 15:08:06 crc kubenswrapper[4785]: E1126 15:08:06.394691 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.406020 4785 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.406048 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.406058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.406071 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.406080 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.417580 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.431371 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.445714 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.466463 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.481773 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.494384 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.507581 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.508389 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.508438 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.508454 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.508472 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.508486 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.520272 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.532691 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.547703 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.562019 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: 
I1126 15:08:06.574356 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.593881 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.611047 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.611096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.611104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.611117 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.611127 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.612929 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 
for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/
etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.624996 4785 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.636002 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd
97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.646499 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:06Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:06 crc 
kubenswrapper[4785]: I1126 15:08:06.718094 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.718159 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.718178 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.718202 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.718541 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.821929 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.821969 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.821981 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.821996 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.822007 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.924217 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.924266 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.924282 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.924303 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:06 crc kubenswrapper[4785]: I1126 15:08:06.924315 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:06Z","lastTransitionTime":"2025-11-26T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.027588 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.027631 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.027646 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.027664 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.027677 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.036367 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.036365 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:07 crc kubenswrapper[4785]: E1126 15:08:07.036519 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:07 crc kubenswrapper[4785]: E1126 15:08:07.036664 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.055141 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.074679 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.105250 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.123759 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.131437 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.131489 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.131500 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.131514 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.131526 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.135892 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.149258 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.171442 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.181924 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: 
I1126 15:08:07.194667 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.212750 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.228378 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70cfa5c92e3afde972c24f4c6d70dcfe2bf73b25223abd8628949f267b3c30c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:07:50Z\\\",\\\"message\\\":\\\"Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056310 6208 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nI1126 15:07:50.056320 6208 services_controller.go:454] Service 
openshift-config-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1126 15:07:50.056218 6208 lb_config.go:1031] Cluster endpoints for openshift-ingress-canary/ingress-canary for network=default are: map[]\\\\nI1126 15:07:50.055930 6208 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-gkxdl in node crc\\\\nI1126 15:07:50.056357 6208 services_controller.go:443] Built service openshift-ingress-canary/ingress-canary LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.34\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8888, clusterEndpoints:services.lbEndpoints\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 
for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/
etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.233910 4785 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.233944 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.233953 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.233967 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.233975 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.237053 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.248742 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959
c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.258839 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.269932 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.284927 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.295879 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.336169 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.336223 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.336236 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.336256 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.336269 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.399941 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/2.log" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.404111 4785 scope.go:117] "RemoveContainer" containerID="e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8" Nov 26 15:08:07 crc kubenswrapper[4785]: E1126 15:08:07.404274 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.421385 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.439544 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.439598 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.439610 4785 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.439626 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.439638 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.440140 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 
15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.459091 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.496223 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.516454 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.528917 4785 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.542472 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.542863 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.542914 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.542926 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.542941 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.542951 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.559166 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.571609 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.586271 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.605451 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.618737 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.632824 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.646517 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.646604 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.646621 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.646669 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.646689 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.651236 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:
07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.667533 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.683199 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d
6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.700773 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:07Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.750157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.750252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.750272 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.750295 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.750313 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.853618 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.853690 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.853735 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.853760 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.853778 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.956287 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.956391 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.956448 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.956532 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:07 crc kubenswrapper[4785]: I1126 15:08:07.956577 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:07Z","lastTransitionTime":"2025-11-26T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.035759 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.035862 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.035953 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.036068 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.059861 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.059917 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.059935 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.059962 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.059985 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.163734 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.163795 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.163818 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.163846 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.163865 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.267243 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.267322 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.267334 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.267351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.267365 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.370733 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.370838 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.370856 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.370905 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.370925 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.474198 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.474288 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.474310 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.474345 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.474369 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.570621 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.570752 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.570824 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:24.570805636 +0000 UTC m=+68.249171420 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.577089 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.577138 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.577150 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.577167 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.577180 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.680893 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.680960 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.680978 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.681006 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.681023 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.772089 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.772267 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772311 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:08:40.772278982 +0000 UTC m=+84.450644766 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.772359 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.772414 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772421 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.772495 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:08 crc 
kubenswrapper[4785]: E1126 15:08:08.772547 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:40.772530898 +0000 UTC m=+84.450896752 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772472 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772634 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:40.772620001 +0000 UTC m=+84.450985775 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772717 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772756 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772782 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772717 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772871 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:40.772839806 +0000 UTC m=+84.451205600 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772881 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772906 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:08 crc kubenswrapper[4785]: E1126 15:08:08.772978 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:40.772960229 +0000 UTC m=+84.451326023 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.783812 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.783878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.783901 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.783931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.783953 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.887116 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.887183 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.887206 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.887230 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.887246 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.990280 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.990367 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.990396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.990428 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:08 crc kubenswrapper[4785]: I1126 15:08:08.990452 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:08Z","lastTransitionTime":"2025-11-26T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.036042 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:09 crc kubenswrapper[4785]: E1126 15:08:09.036688 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.036720 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:09 crc kubenswrapper[4785]: E1126 15:08:09.036927 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.092699 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.092745 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.092756 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.092786 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.092795 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.195185 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.195237 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.195252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.195273 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.195289 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.297502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.297548 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.297575 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.297590 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.297600 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.400407 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.400481 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.400503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.400535 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.400614 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.503174 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.503213 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.503222 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.503235 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.503245 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.605291 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.605339 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.605347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.605361 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.605370 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.708103 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.708152 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.708164 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.708182 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.708194 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.811494 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.811601 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.811621 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.811644 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.811667 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.914919 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.915002 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.915024 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.915050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:09 crc kubenswrapper[4785]: I1126 15:08:09.915070 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:09Z","lastTransitionTime":"2025-11-26T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.018213 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.018261 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.018270 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.018284 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.018296 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.035705 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.035738 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:10 crc kubenswrapper[4785]: E1126 15:08:10.035850 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:10 crc kubenswrapper[4785]: E1126 15:08:10.036062 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.121066 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.121130 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.121154 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.121187 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.121208 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.224895 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.224969 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.225052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.225086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.225112 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.327533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.327586 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.327595 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.327614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.327632 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.429589 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.429672 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.429696 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.429724 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.429748 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.531937 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.532013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.532033 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.532058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.532084 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.634781 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.634822 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.634839 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.634853 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.634863 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.738359 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.738440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.738459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.738484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.738502 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.841597 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.841660 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.841676 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.841693 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.841705 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.881223 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.897501 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.904708 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:10Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.923126 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:10Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.935905 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:10Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.945105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.945155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.945166 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.945183 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.945196 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:10Z","lastTransitionTime":"2025-11-26T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.954887 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:10Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.975456 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:10Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:10 crc kubenswrapper[4785]: I1126 15:08:10.992124 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:10Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: 
I1126 15:08:11.007110 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.036101 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.036091 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be3
0a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:2
0Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: E1126 15:08:11.036317 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.036134 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:11 crc kubenswrapper[4785]: E1126 15:08:11.036589 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.052393 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.052451 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.052469 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.052495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.052513 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.056305 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.074478 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.105877 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.124749 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.146338 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.155466 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.155507 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.155518 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.155534 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.155545 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.162421 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc 
kubenswrapper[4785]: I1126 15:08:11.181546 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.198336 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.216379 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:11Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.257882 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.258028 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.258051 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.258078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.258096 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.361475 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.361590 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.361611 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.361635 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.361652 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.464270 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.464325 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.464342 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.464366 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.464384 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.567670 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.567769 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.567794 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.567819 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.567840 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.670990 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.671049 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.671065 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.671089 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.671108 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.773962 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.774043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.774052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.774068 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.774078 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.876440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.876501 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.876521 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.876544 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.876596 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.979822 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.979867 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.979918 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.979938 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:11 crc kubenswrapper[4785]: I1126 15:08:11.979950 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:11Z","lastTransitionTime":"2025-11-26T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.035345 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:12 crc kubenswrapper[4785]: E1126 15:08:12.035507 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.035342 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:12 crc kubenswrapper[4785]: E1126 15:08:12.035772 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.082060 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.082107 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.082120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.082137 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.082148 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.185326 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.185392 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.185408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.185436 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.185453 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.288394 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.288452 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.288469 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.288495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.288512 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.391854 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.391909 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.391926 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.391948 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.391966 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.494899 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.494991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.495017 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.495048 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.495072 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.598490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.598614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.598636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.598657 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.598706 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.701702 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.701765 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.701783 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.701809 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.701867 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.805298 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.805370 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.805388 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.805411 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.805441 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.908728 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.908805 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.908829 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.908860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:12 crc kubenswrapper[4785]: I1126 15:08:12.908882 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:12Z","lastTransitionTime":"2025-11-26T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.012577 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.012620 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.012630 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.012645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.012656 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.035988 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.036098 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:13 crc kubenswrapper[4785]: E1126 15:08:13.036255 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:13 crc kubenswrapper[4785]: E1126 15:08:13.036446 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.116010 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.116058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.116071 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.116088 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.116100 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.219164 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.219195 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.219205 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.219220 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.219230 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.322250 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.322317 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.322330 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.322348 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.322360 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.424938 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.425011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.425032 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.425054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.425073 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.527926 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.527962 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.527972 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.528011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.528022 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.630565 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.630602 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.630613 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.630628 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.630638 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.732680 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.732735 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.732745 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.732759 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.732768 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.836729 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.836791 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.836809 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.836832 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.836848 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.939569 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.939619 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.939633 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.939651 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:13 crc kubenswrapper[4785]: I1126 15:08:13.939666 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:13Z","lastTransitionTime":"2025-11-26T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.035704 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.035760 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.035886 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.035985 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.043354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.043413 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.043452 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.043489 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.043514 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.146244 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.146307 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.146327 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.146351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.146368 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.159311 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.159360 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.159380 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.159397 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.159408 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.173663 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:14Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.178013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.178043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.178052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.178067 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.178089 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.191806 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:14Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.196214 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.196248 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.196256 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.196268 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.196277 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.217588 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:14Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.221787 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.221838 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.221849 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.221865 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.221876 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.236223 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:14Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.239534 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.239571 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.239579 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.239592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.239600 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.257827 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:14Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:14 crc kubenswrapper[4785]: E1126 15:08:14.257999 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.259949 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.259987 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.260001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.260021 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.260035 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.362690 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.362758 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.362779 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.362806 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.362828 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.465273 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.465360 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.465377 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.465400 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.465420 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.568796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.568844 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.568862 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.568884 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.568900 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.671514 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.671575 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.671587 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.671605 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.671623 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.774645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.774739 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.774758 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.774789 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.774808 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.878278 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.878321 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.878330 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.878345 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.878355 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.982095 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.982166 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.982185 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.982209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:14 crc kubenswrapper[4785]: I1126 15:08:14.982229 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:14Z","lastTransitionTime":"2025-11-26T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.036069 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:15 crc kubenswrapper[4785]: E1126 15:08:15.036294 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.036529 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:15 crc kubenswrapper[4785]: E1126 15:08:15.036684 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.085513 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.085621 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.085647 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.085682 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.085704 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.188491 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.188602 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.188621 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.188645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.188662 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.291936 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.292010 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.292022 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.292042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.292058 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.395794 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.395845 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.395855 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.395879 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.395891 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.500217 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.500286 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.500307 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.500332 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.500352 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.604087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.604139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.604155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.604179 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.604195 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.707223 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.707299 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.707323 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.707353 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.707374 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.810171 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.810223 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.810235 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.810254 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.810266 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.913659 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.913725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.913746 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.913775 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:15 crc kubenswrapper[4785]: I1126 15:08:15.913796 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:15Z","lastTransitionTime":"2025-11-26T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.016795 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.016835 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.016846 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.016862 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.016874 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.035304 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:16 crc kubenswrapper[4785]: E1126 15:08:16.035455 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.035599 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:16 crc kubenswrapper[4785]: E1126 15:08:16.035830 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.119983 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.120076 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.120101 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.120137 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.120163 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.223521 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.223893 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.223986 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.224097 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.224193 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.326746 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.326815 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.326832 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.326852 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.326866 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.430603 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.430655 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.430667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.430689 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.430700 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.533299 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.533378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.533396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.533419 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.533437 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.636329 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.636760 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.636906 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.637060 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.637196 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.740002 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.740044 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.740054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.740072 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.740085 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.843460 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.843515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.843527 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.843545 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.843578 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.946443 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.946487 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.946503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.946519 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:16 crc kubenswrapper[4785]: I1126 15:08:16.946530 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:16Z","lastTransitionTime":"2025-11-26T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.035537 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:17 crc kubenswrapper[4785]: E1126 15:08:17.035693 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.035546 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:17 crc kubenswrapper[4785]: E1126 15:08:17.035772 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.049280 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.049535 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.049546 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.049580 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.049364 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.049593 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.060028 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.071020 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.082988 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.095256 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.112934 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c
72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.123906 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.151829 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.151946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.151966 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: 
I1126 15:08:17.151976 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.151992 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.152009 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.171954 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.187380 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.200989 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.218875 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.232963 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: 
I1126 15:08:17.246946 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.255375 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.255415 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.255465 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.255484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.255496 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.269867 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.286080 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.298186 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc 
kubenswrapper[4785]: I1126 15:08:17.311638 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:17Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.358613 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.358646 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.358657 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.358672 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.358683 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.460800 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.461781 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.461839 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.461879 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.461905 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.565071 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.565128 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.565144 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.565165 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.565180 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.667625 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.667679 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.667705 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.667727 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.667741 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.771368 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.771444 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.771464 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.771486 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.771503 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.873406 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.873442 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.873450 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.873463 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.873473 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.976431 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.976490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.976506 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.976528 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:17 crc kubenswrapper[4785]: I1126 15:08:17.976546 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:17Z","lastTransitionTime":"2025-11-26T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.035682 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.035705 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:18 crc kubenswrapper[4785]: E1126 15:08:18.035920 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:18 crc kubenswrapper[4785]: E1126 15:08:18.036137 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.080513 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.080667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.080741 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.080774 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.080796 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.182730 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.182765 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.182773 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.182787 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.182799 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.284623 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.284698 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.284711 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.284729 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.284745 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.386949 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.387006 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.387023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.387044 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.387061 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.489328 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.489388 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.489402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.489417 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.489428 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.591508 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.591546 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.591580 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.591595 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.591604 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.693900 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.693958 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.693974 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.693997 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.694014 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.797389 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.797520 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.797587 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.797625 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.797649 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.900101 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.900143 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.900158 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.900173 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:18 crc kubenswrapper[4785]: I1126 15:08:18.900183 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:18Z","lastTransitionTime":"2025-11-26T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.002402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.002468 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.002490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.002511 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.002525 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.036341 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.036410 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:19 crc kubenswrapper[4785]: E1126 15:08:19.036642 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:19 crc kubenswrapper[4785]: E1126 15:08:19.036743 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.105112 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.105157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.105173 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.105190 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.105201 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.208992 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.209055 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.209077 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.209105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.209131 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.311799 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.311849 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.311860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.311878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.311892 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.415321 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.415386 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.415404 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.415426 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.415440 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.518023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.518057 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.518066 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.518079 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.518088 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.620547 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.620607 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.620615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.620627 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.620637 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.722884 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.722931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.722942 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.722959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.722973 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.825514 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.825757 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.825766 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.825779 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.825789 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.928583 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.928637 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.928650 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.928668 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:19 crc kubenswrapper[4785]: I1126 15:08:19.928681 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:19Z","lastTransitionTime":"2025-11-26T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.031466 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.031547 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.031613 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.031647 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.031677 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.036182 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.036180 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:20 crc kubenswrapper[4785]: E1126 15:08:20.036433 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:20 crc kubenswrapper[4785]: E1126 15:08:20.036601 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.134209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.134259 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.134279 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.134302 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.134320 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.237056 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.237103 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.237118 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.237136 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.237149 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.339155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.339190 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.339201 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.339216 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.339227 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.441108 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.441146 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.441158 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.441173 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.441184 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.544330 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.544377 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.544392 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.544414 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.544432 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.647316 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.647395 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.647412 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.647438 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.647457 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.750231 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.750298 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.750312 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.750372 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.750387 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.852544 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.852844 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.852943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.853042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.853109 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.956045 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.956082 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.956091 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.956104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:20 crc kubenswrapper[4785]: I1126 15:08:20.956114 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:20Z","lastTransitionTime":"2025-11-26T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.035926 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.035926 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:21 crc kubenswrapper[4785]: E1126 15:08:21.036079 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:21 crc kubenswrapper[4785]: E1126 15:08:21.036125 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.036801 4785 scope.go:117] "RemoveContainer" containerID="e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8" Nov 26 15:08:21 crc kubenswrapper[4785]: E1126 15:08:21.036935 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.058230 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.058276 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.058287 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.058306 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 
15:08:21.058318 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.161709 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.161778 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.161799 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.161824 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.161842 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.264125 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.264169 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.264181 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.264198 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.264209 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.367981 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.368053 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.368065 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.368085 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.368096 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.470776 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.470848 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.470865 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.470894 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.470910 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.573757 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.573801 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.573813 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.573829 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.573842 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.676369 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.676395 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.676404 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.676417 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.676425 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.779677 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.779718 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.779733 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.779749 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.779761 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.882881 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.882921 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.882930 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.882946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.882956 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.985419 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.985469 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.985485 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.985505 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:21 crc kubenswrapper[4785]: I1126 15:08:21.985518 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:21Z","lastTransitionTime":"2025-11-26T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.035405 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.035473 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:22 crc kubenswrapper[4785]: E1126 15:08:22.035749 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:22 crc kubenswrapper[4785]: E1126 15:08:22.035826 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.045746 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.087920 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.087958 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.087968 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.087984 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.087995 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.190020 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.190069 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.190081 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.190097 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.190113 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.292744 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.292796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.292806 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.292821 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.292832 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.396207 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.396258 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.396268 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.396286 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.396298 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.497935 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.497976 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.497986 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.498001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.498013 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.600909 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.600958 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.600970 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.600988 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.600999 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.704669 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.704707 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.704717 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.704733 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.704744 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.807445 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.807489 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.807500 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.807517 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.807527 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.910396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.910455 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.910468 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.910492 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:22 crc kubenswrapper[4785]: I1126 15:08:22.910506 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:22Z","lastTransitionTime":"2025-11-26T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.012794 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.012834 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.012843 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.012858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.012868 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.035784 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:23 crc kubenswrapper[4785]: E1126 15:08:23.035917 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.036620 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:23 crc kubenswrapper[4785]: E1126 15:08:23.036855 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.114892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.114922 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.114931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.114943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.114953 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.218150 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.218189 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.218197 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.218212 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.218222 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.320776 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.320829 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.320839 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.320853 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.320864 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.423054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.423120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.423139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.423165 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.423182 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.526043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.526103 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.526127 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.526155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.526176 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.632502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.632639 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.632667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.632700 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.632721 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.735616 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.735684 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.735703 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.735725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.735742 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.839019 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.839058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.839068 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.839086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.839097 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.942056 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.942122 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.942136 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.942157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:23 crc kubenswrapper[4785]: I1126 15:08:23.942170 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:23Z","lastTransitionTime":"2025-11-26T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.035888 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.035932 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.036070 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.036226 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.045321 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.045365 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.045386 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.045414 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.045437 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.148518 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.148588 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.148599 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.148614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.148624 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.252001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.252044 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.252055 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.252071 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.252084 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.354825 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.354860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.354873 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.354888 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.354899 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.457610 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.457643 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.457653 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.457668 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.457679 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.560374 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.560430 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.560442 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.560460 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.560475 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.642337 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.642374 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.642383 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.642400 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.642410 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.654036 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:24Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.658143 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.658193 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.658210 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.658233 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.658250 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.665527 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.665655 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.665712 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:08:56.665698734 +0000 UTC m=+100.344064498 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.674975 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7
e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:24Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.683072 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.683114 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.683127 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.683145 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.683158 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.696089 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:24Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.700266 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.700301 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.700311 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.700324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.700335 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.711688 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:24Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.716299 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.716340 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.716364 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.716379 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.716389 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.731921 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:24Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:24 crc kubenswrapper[4785]: E1126 15:08:24.732070 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.733308 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.733340 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.733351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.733365 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.733378 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.836134 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.836190 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.836202 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.836220 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.836233 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.939317 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.939370 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.939405 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.939424 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:24 crc kubenswrapper[4785]: I1126 15:08:24.939436 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:24Z","lastTransitionTime":"2025-11-26T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.035983 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:25 crc kubenswrapper[4785]: E1126 15:08:25.036113 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.036004 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:25 crc kubenswrapper[4785]: E1126 15:08:25.036438 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.040869 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.041170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.041231 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.041302 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.041367 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.143642 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.143667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.143678 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.143692 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.143701 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.245819 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.245852 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.245860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.245873 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.245883 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.347997 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.348050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.348058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.348072 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.348082 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.450176 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.450204 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.450212 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.450224 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.450232 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.552837 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.552890 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.552907 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.552929 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.552947 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.654909 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.654943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.654976 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.654991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.655001 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.757426 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.757497 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.757519 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.757587 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.757603 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.861154 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.861206 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.861264 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.861283 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.861315 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.964441 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.964474 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.964483 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.964496 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:25 crc kubenswrapper[4785]: I1126 15:08:25.964505 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:25Z","lastTransitionTime":"2025-11-26T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.035893 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:26 crc kubenswrapper[4785]: E1126 15:08:26.036088 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.035893 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:26 crc kubenswrapper[4785]: E1126 15:08:26.037361 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.066197 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.066239 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.066251 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.066266 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.066277 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.168620 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.168654 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.168662 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.168681 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.168691 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.271165 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.271199 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.271208 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.271223 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.271232 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.373883 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.373922 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.373932 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.373945 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.373956 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.466241 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/0.log" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.466291 4785 generic.go:334] "Generic (PLEG): container finished" podID="855bd894-cca9-4fe1-a0d5-8b72afe7c93a" containerID="041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0" exitCode=1 Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.466323 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerDied","Data":"041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.466761 4785 scope.go:117] "RemoveContainer" containerID="041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.477318 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.477366 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.477378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.477394 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.477762 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.480884 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.494476 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.511153 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c
72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.526541 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.545727 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.561184 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.575104 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.579366 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.579524 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.579654 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.579780 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.579900 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.589242 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.604045 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.616873 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: 
I1126 15:08:26.630156 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.658102 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.671495 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.682169 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.682220 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.682231 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.682251 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.682262 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.685848 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc 
kubenswrapper[4785]: I1126 15:08:26.701940 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.719121 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.731774 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.744576 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.756447 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:26Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.784833 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.785102 4785 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.785340 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.785518 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.785636 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.888610 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.888658 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.888671 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.888689 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.888703 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.990725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.990772 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.990782 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.990799 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:26 crc kubenswrapper[4785]: I1126 15:08:26.990810 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:26Z","lastTransitionTime":"2025-11-26T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.036280 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:27 crc kubenswrapper[4785]: E1126 15:08:27.036490 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.037395 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:27 crc kubenswrapper[4785]: E1126 15:08:27.037549 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.053001 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c
72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.067822 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.083071 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.093306 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.093347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.093651 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.093797 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.093822 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.096103 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.110342 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.129340 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.141493 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: 
I1126 15:08:27.159301 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.176140 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.195892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.195932 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.195943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.195960 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.195973 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.207469 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.221785 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.250708 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.263282 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.280284 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.295340 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc 
kubenswrapper[4785]: I1126 15:08:27.298902 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.298966 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.298987 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.299013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.299039 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.316510 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.336365 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.346706 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.356660 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.401123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.401155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.401166 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.401183 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.401196 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.472336 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/0.log" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.472392 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerStarted","Data":"96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.487495 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.501077 4785 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.503525 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.503590 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.503599 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.503612 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.503620 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.514426 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.530936 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.541634 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: 
I1126 15:08:27.553172 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.566295 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.585822 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.605313 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.605342 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: 
I1126 15:08:27.605354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.605368 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.605379 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.610238 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.622122 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.637599 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.649976 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc 
kubenswrapper[4785]: I1126 15:08:27.666060 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.678680 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.691038 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.700724 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.708480 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.708517 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.708534 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.708558 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.708599 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.716941 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.732443 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.747808 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:27Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.810867 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.810911 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.810929 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.810954 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.810972 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.913526 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.913588 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.913600 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.913616 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:27 crc kubenswrapper[4785]: I1126 15:08:27.913628 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:27Z","lastTransitionTime":"2025-11-26T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.015681 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.015734 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.015748 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.015768 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.015781 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.035421 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.035491 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:28 crc kubenswrapper[4785]: E1126 15:08:28.035613 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:28 crc kubenswrapper[4785]: E1126 15:08:28.035703 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.118251 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.118307 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.118323 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.118350 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.118372 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.220735 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.220769 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.220777 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.220796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.220807 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.328285 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.328347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.328360 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.328377 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.328393 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.431467 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.431509 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.431525 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.431546 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.431596 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.534474 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.534515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.534524 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.534539 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.534574 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.636539 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.636608 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.636620 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.636640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.636693 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.739236 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.739308 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.739320 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.739335 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.739345 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.842113 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.842168 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.842186 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.842209 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.842226 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.944580 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.944625 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.944635 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.944650 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:28 crc kubenswrapper[4785]: I1126 15:08:28.944661 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:28Z","lastTransitionTime":"2025-11-26T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.035622 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:29 crc kubenswrapper[4785]: E1126 15:08:29.035817 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.035625 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:29 crc kubenswrapper[4785]: E1126 15:08:29.036052 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.046383 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.046408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.046417 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.046427 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.046436 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.148396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.148430 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.148458 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.148473 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.148484 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.250161 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.250242 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.250266 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.250290 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.250309 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.353701 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.353779 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.353804 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.353837 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.353863 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.456807 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.456883 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.456905 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.456933 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.456956 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.559345 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.559373 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.559382 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.559398 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.559408 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.662003 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.662043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.662052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.662066 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.662077 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.765298 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.765365 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.765376 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.765396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.765407 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.868354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.868404 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.868421 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.868473 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.868490 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.970984 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.971057 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.971077 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.971100 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:29 crc kubenswrapper[4785]: I1126 15:08:29.971120 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:29Z","lastTransitionTime":"2025-11-26T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.035880 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.035898 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:30 crc kubenswrapper[4785]: E1126 15:08:30.036077 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:30 crc kubenswrapper[4785]: E1126 15:08:30.036220 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.074533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.074592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.074602 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.074619 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.074631 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.177218 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.177276 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.177293 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.177315 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.177333 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.280305 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.280341 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.280356 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.280375 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.280391 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.382709 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.382749 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.382757 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.382771 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.382780 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.485076 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.485124 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.485141 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.485169 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.485186 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.587789 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.587850 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.587867 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.587889 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.587908 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.690413 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.690459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.690474 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.690495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.690509 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.792593 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.792632 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.792645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.792663 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.792676 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.895717 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.895760 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.895772 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.895789 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.895801 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.998438 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.998492 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.998505 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.998520 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:30 crc kubenswrapper[4785]: I1126 15:08:30.998533 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:30Z","lastTransitionTime":"2025-11-26T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.035720 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:31 crc kubenswrapper[4785]: E1126 15:08:31.035865 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.036077 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:31 crc kubenswrapper[4785]: E1126 15:08:31.036174 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.101215 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.101277 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.101292 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.101314 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.101327 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.204742 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.204794 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.204803 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.204820 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.204828 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.307819 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.307868 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.307885 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.307908 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.307924 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.410589 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.410630 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.410644 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.410662 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.410674 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.512247 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.512316 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.512329 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.512347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.512359 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.615396 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.615440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.615449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.615466 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.615475 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.718252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.718319 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.718336 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.718361 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.718381 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.820784 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.820831 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.820843 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.820860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.820872 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.923995 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.924070 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.924083 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.924098 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:31 crc kubenswrapper[4785]: I1126 15:08:31.924109 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:31Z","lastTransitionTime":"2025-11-26T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.026906 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.027508 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.027749 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.027982 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.028198 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.036364 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:32 crc kubenswrapper[4785]: E1126 15:08:32.036603 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.036372 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:32 crc kubenswrapper[4785]: E1126 15:08:32.036809 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.130170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.130501 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.130628 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.130725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.130809 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.233493 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.233771 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.233849 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.233918 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.233986 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.336649 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.336688 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.336697 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.336712 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.336722 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.439350 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.439710 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.439799 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.439876 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.439965 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.542954 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.542990 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.542999 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.543013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.543022 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.644626 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.644662 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.644673 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.644690 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.644700 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.746797 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.746837 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.746845 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.746860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.746869 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.849846 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.849879 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.849892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.849908 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.849919 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.952361 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.952725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.952916 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.953081 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:32 crc kubenswrapper[4785]: I1126 15:08:32.953223 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:32Z","lastTransitionTime":"2025-11-26T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.036232 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:33 crc kubenswrapper[4785]: E1126 15:08:33.036386 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.036465 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:33 crc kubenswrapper[4785]: E1126 15:08:33.036996 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.037314 4785 scope.go:117] "RemoveContainer" containerID="e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.054856 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.054896 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.054907 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.054921 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.054933 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.157493 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.157531 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.157539 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.157556 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.157591 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.259610 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.259647 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.259657 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.259671 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.259680 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.362615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.362691 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.362709 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.362733 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.362749 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.464876 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.464916 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.464926 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.464939 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.464949 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.494696 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/2.log" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.497320 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.497976 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.517314 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 
15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.530996 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.545032 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.559389 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.567187 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.567228 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.567238 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.567254 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.567265 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.570885 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.583196 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.596716 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.609931 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: 
I1126 15:08:33.620429 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.632832 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.650213 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.666574 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.669033 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.669070 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.669081 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.669099 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.669110 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.677630 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.693010 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] 
Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.704006 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.716717 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.726731 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.745191 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.756477 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:33Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.770920 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.770954 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.770968 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.770983 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.770994 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.872748 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.872781 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.872791 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.872804 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.872815 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.974748 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.974789 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.974801 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.974817 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:33 crc kubenswrapper[4785]: I1126 15:08:33.974832 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:33Z","lastTransitionTime":"2025-11-26T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.035856 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.035965 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:34 crc kubenswrapper[4785]: E1126 15:08:34.035987 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:34 crc kubenswrapper[4785]: E1126 15:08:34.036067 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.076811 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.076874 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.076899 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.076929 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.076952 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.180498 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.180536 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.180544 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.180583 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.180601 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.284304 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.284398 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.284425 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.284455 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.284477 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.389217 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.389368 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.389394 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.389412 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.389423 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.492266 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.492331 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.492348 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.492371 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.492388 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.503488 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/3.log" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.504452 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/2.log" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.508837 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" exitCode=1 Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.508900 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.509008 4785 scope.go:117] "RemoveContainer" containerID="e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.509489 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:08:34 crc kubenswrapper[4785]: E1126 15:08:34.509678 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.526027 4785 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.540522 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.555342 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.567441 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.587676 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.594732 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.594768 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.594778 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.594792 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 
26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.594801 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.604067 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.618284 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.633951 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.648318 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.662575 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.685075 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.698417 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.699031 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.699104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.699123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.699144 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.699189 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.712432 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.725833 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.742670 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.762122 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e098d64d0baa7671b1693be648e0a05bcf267a0fbb27f8df8341589e7f2403a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:05Z\\\",\\\"message\\\":\\\" event handler 8\\\\nI1126 15:08:05.966708 6428 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1126 15:08:05.966737 6428 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1126 15:08:05.966799 6428 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1126 15:08:05.966828 6428 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI1126 15:08:05.966851 6428 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1126 15:08:05.966838 6428 handler.go:208] Removed *v1.Node event handler 2\\\\nI1126 15:08:05.966854 6428 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1126 15:08:05.966906 6428 factory.go:656] Stopping watch factory\\\\nI1126 15:08:05.966902 6428 handler.go:208] Removed *v1.Node event handler 7\\\\nI1126 15:08:05.966903 6428 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:05.966957 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1126 15:08:05.967001 6428 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:05.967033 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:05.967055 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:05.967109 6428 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:33Z\\\",\\\"message\\\":\\\"40702 6788 obj_retry.go:551] Creating *factory.egressNode crc took: 3.419578ms\\\\nI1126 15:08:33.840718 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1126 15:08:33.840744 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1126 15:08:33.840780 6788 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1126 15:08:33.840740 6788 loadbalancer.go:304] 
Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1126 15:08:33.840820 6788 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 870.792µs\\\\nI1126 15:08:33.840809 6788 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.189107ms\\\\nI1126 15:08:33.840943 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:33.841021 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:33.841047 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:33.841064 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:33.841175 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\
\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 
15:08:34.772297 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.784023 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.792876 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:34Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:34 crc 
kubenswrapper[4785]: I1126 15:08:34.801501 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.801593 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.801612 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.801636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.801653 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.904293 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.904332 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.904343 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.904357 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:34 crc kubenswrapper[4785]: I1126 15:08:34.904371 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:34Z","lastTransitionTime":"2025-11-26T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.007627 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.007677 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.007689 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.007711 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.007726 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.036137 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.036264 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.036137 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.036394 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.045754 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.045786 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.045796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.045808 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.045817 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.059810 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.063847 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.063895 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.063910 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.063930 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.063943 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.080539 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.085309 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.085384 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.085407 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.085441 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.085468 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.107369 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.112120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.112175 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.112190 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.112211 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.112227 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.128166 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.131424 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.131474 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.131486 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.131506 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.131518 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.144378 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.144520 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.146138 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.146170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.146178 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.146193 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.146203 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.248850 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.248890 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.248898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.248943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.248952 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.351506 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.351541 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.351577 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.351592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.351601 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.454592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.454666 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.454683 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.454699 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.454710 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.514620 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/3.log" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.520161 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:08:35 crc kubenswrapper[4785]: E1126 15:08:35.520466 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.532813 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.553622 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.557654 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.557713 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.557736 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.557766 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.557787 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.569175 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc 
kubenswrapper[4785]: I1126 15:08:35.583797 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.602058 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.616309 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.632617 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.660795 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.660846 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.660861 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.660885 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.660900 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.674811 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.702046 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.715285 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.724969 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995
797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.735790 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.746659 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.762938 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.763015 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.763047 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.763078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.763099 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.767179 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.779343 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.789761 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.801403 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.815913 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.834663 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:33Z\\\",\\\"message\\\":\\\"40702 6788 obj_retry.go:551] Creating *factory.egressNode crc took: 3.419578ms\\\\nI1126 15:08:33.840718 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1126 15:08:33.840744 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1126 15:08:33.840780 6788 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1126 15:08:33.840740 6788 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1126 15:08:33.840820 6788 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 870.792µs\\\\nI1126 15:08:33.840809 6788 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.189107ms\\\\nI1126 15:08:33.840943 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:33.841021 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:33.841047 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:33.841064 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:33.841175 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:35Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.865490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.865539 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.865570 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.865589 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.865603 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.968049 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.968136 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.968160 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.968192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:35 crc kubenswrapper[4785]: I1126 15:08:35.968210 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:35Z","lastTransitionTime":"2025-11-26T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.035715 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.035713 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:36 crc kubenswrapper[4785]: E1126 15:08:36.035924 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:36 crc kubenswrapper[4785]: E1126 15:08:36.036097 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.070386 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.070432 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.070444 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.070462 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.070475 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.172540 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.172624 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.172640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.172662 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.172678 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.275459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.275593 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.275619 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.275648 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.275670 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.378165 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.378216 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.378232 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.378249 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.378261 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.481147 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.481184 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.481192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.481204 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.481212 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.584349 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.584380 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.584388 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.584401 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.584409 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.686910 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.686946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.686955 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.686967 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.686976 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.789068 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.789121 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.789139 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.789157 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.789171 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.891904 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.891959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.891973 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.891993 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.892006 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.994325 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.994363 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.994375 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.994394 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:36 crc kubenswrapper[4785]: I1126 15:08:36.994406 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:36Z","lastTransitionTime":"2025-11-26T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.035390 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.035441 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:37 crc kubenswrapper[4785]: E1126 15:08:37.035510 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:37 crc kubenswrapper[4785]: E1126 15:08:37.035961 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.048320 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.058671 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d
6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.069780 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.080623 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.092254 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.103176 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.103234 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.103246 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.103263 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.103296 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.106107 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2
202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.120970 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.137070 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.149660 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.161248 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.173963 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.186346 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.196363 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: 
I1126 15:08:37.206825 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.206860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.206869 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.206882 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.206893 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.211675 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118
b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.222177 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.244837 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:33Z\\\",\\\"message\\\":\\\"40702 6788 obj_retry.go:551] Creating *factory.egressNode crc took: 3.419578ms\\\\nI1126 
15:08:33.840718 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1126 15:08:33.840744 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1126 15:08:33.840780 6788 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1126 15:08:33.840740 6788 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1126 15:08:33.840820 6788 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 870.792µs\\\\nI1126 15:08:33.840809 6788 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.189107ms\\\\nI1126 15:08:33.840943 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:33.841021 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:33.841047 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:33.841064 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:33.841175 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.256704 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.268970 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.283575 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] 
Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:37Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.310046 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.310082 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.310093 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.310110 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.310123 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.412934 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.413006 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.413022 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.413042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.413053 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.515863 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.516001 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.516024 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.516049 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.516068 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.618815 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.618858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.618869 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.618886 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.618897 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.722042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.722100 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.722120 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.722144 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.722162 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.825702 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.825757 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.825773 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.825796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.825817 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.929132 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.929203 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.929221 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.929240 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:37 crc kubenswrapper[4785]: I1126 15:08:37.929252 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:37Z","lastTransitionTime":"2025-11-26T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.033171 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.033290 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.033309 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.033338 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.033358 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.035743 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.035743 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:38 crc kubenswrapper[4785]: E1126 15:08:38.036337 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:38 crc kubenswrapper[4785]: E1126 15:08:38.036447 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.136455 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.136499 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.136508 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.136527 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.136536 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.241204 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.241243 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.241255 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.241272 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.241282 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.344579 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.344644 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.344659 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.344682 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.344697 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.447302 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.447341 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.447351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.447365 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.447374 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.549807 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.549843 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.549856 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.549871 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.549881 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.652494 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.652888 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.653043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.653274 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.653501 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.756297 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.757072 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.757254 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.757492 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.757888 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.860333 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.860652 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.860735 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.860828 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.860915 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.963118 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.963164 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.963176 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.963194 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:38 crc kubenswrapper[4785]: I1126 15:08:38.963206 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:38Z","lastTransitionTime":"2025-11-26T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.036369 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:39 crc kubenswrapper[4785]: E1126 15:08:39.036494 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.036756 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:39 crc kubenswrapper[4785]: E1126 15:08:39.036819 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.065077 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.065109 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.065116 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.065128 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.065136 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.167710 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.167769 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.167780 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.167796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.167807 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.270479 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.270605 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.270642 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.270674 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.270696 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.374079 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.374119 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.374129 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.374145 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.374156 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.476545 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.476741 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.476761 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.476818 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.476836 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.580130 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.580181 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.580197 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.580217 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.580231 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.683219 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.683428 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.683449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.683476 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.683492 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.786046 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.786108 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.786127 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.786149 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.786166 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.888998 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.889064 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.889078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.889101 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.889117 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.991954 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.992011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.992027 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.992050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:39 crc kubenswrapper[4785]: I1126 15:08:39.992069 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:39Z","lastTransitionTime":"2025-11-26T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.036130 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.036156 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.036354 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.036523 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.094435 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.094522 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.094547 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.094632 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.094658 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.197844 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.197889 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.197898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.197912 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.197921 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.301227 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.301323 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.301338 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.301355 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.301368 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.404306 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.404367 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.404400 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.404427 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.404450 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.507505 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.507690 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.507718 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.507747 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.507764 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.610252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.610333 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.610371 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.610402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.610424 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.713951 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.714025 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.714048 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.714078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.714100 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.817484 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.817581 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.817607 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.817635 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.817658 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.830344 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.830514 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 15:09:44.830479694 +0000 UTC m=+148.508845498 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.830668 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.830730 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.830813 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.830871 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.830884 4785 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.830969 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.830943816 +0000 UTC m=+148.509309620 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.830986 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831017 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831032 4785 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: 
E1126 15:08:40.831053 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831077 4785 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831117 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.83109173 +0000 UTC m=+148.509457534 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831032 4785 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831154 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.831130571 +0000 UTC m=+148.509496365 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831167 4785 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:40 crc kubenswrapper[4785]: E1126 15:08:40.831237 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.831219303 +0000 UTC m=+148.509585097 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.921305 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.921397 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.921423 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.921451 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:40 crc kubenswrapper[4785]: I1126 15:08:40.921469 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:40Z","lastTransitionTime":"2025-11-26T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.025018 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.025094 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.025111 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.025134 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.025153 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.035701 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.035777 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:41 crc kubenswrapper[4785]: E1126 15:08:41.035885 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:41 crc kubenswrapper[4785]: E1126 15:08:41.036096 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.128049 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.128171 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.128191 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.128214 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.128231 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.231523 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.231615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.231636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.231661 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.231678 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.335213 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.335267 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.335279 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.335296 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.335308 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.437643 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.437687 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.437695 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.437709 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.437718 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.539385 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.539425 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.539433 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.539445 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.539453 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.642324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.642407 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.642425 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.642449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.642467 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.745316 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.745377 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.745403 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.745431 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.745453 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.848920 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.848991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.849025 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.849054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.849074 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.952202 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.952282 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.952301 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.952317 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:41 crc kubenswrapper[4785]: I1126 15:08:41.952328 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:41Z","lastTransitionTime":"2025-11-26T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.035471 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.035474 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:42 crc kubenswrapper[4785]: E1126 15:08:42.035631 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:42 crc kubenswrapper[4785]: E1126 15:08:42.035712 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.055673 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.055755 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.055812 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.055832 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.055845 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.161265 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.161308 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.161319 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.161334 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.161345 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.264445 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.264483 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.264495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.264510 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.264521 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.367350 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.367415 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.367432 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.367455 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.367475 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.472311 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.472397 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.472411 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.472443 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.472470 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.574335 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.574631 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.574710 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.574785 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.574847 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.678360 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.678795 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.679064 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.679273 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.679445 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.782864 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.782946 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.782972 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.783002 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.783025 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.886075 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.888132 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.888280 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.888457 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.888653 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.992297 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.992369 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.992393 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.992423 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:42 crc kubenswrapper[4785]: I1126 15:08:42.992446 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:42Z","lastTransitionTime":"2025-11-26T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.036059 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:43 crc kubenswrapper[4785]: E1126 15:08:43.036304 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.036730 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:43 crc kubenswrapper[4785]: E1126 15:08:43.036925 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.097019 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.097086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.097104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.097130 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.097149 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.199969 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.200021 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.200033 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.200052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.200066 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.302836 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.302874 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.302884 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.302898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.302908 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.405914 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.406043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.406117 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.406154 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.406232 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.509359 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.509429 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.509446 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.509470 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.509491 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.612333 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.612408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.612432 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.612466 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.612492 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.716528 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.716641 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.716665 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.716698 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.716723 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.820950 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.821007 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.821026 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.821055 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.821073 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.924159 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.924214 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.924231 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.924257 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:43 crc kubenswrapper[4785]: I1126 15:08:43.924274 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:43Z","lastTransitionTime":"2025-11-26T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.026958 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.027023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.027040 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.027067 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.027087 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.035583 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.035631 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:44 crc kubenswrapper[4785]: E1126 15:08:44.035751 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:44 crc kubenswrapper[4785]: E1126 15:08:44.035908 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.129893 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.129969 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.129995 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.130026 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.130050 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.233470 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.233627 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.233650 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.233672 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.233690 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.336730 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.336791 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.336814 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.336845 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.336869 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.440096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.440181 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.440219 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.440250 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.440279 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.542545 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.542674 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.542723 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.542756 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.542777 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.645534 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.645652 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.645675 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.645700 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.645721 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.750023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.750085 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.750099 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.750119 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.750130 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.852908 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.852990 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.853013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.853039 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.853057 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.957041 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.957106 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.957118 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.957137 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:44 crc kubenswrapper[4785]: I1126 15:08:44.957150 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:44Z","lastTransitionTime":"2025-11-26T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.036151 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.036336 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.036517 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.036835 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.060874 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.060947 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.060967 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.060994 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.061015 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.164673 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.164753 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.164773 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.164796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.164812 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.268298 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.268372 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.268395 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.268425 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.268447 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.333716 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.333770 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.333785 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.333804 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.333818 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.351504 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.358146 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.358232 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.358264 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.358306 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.358334 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.381000 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.385701 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.385773 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.385800 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.385831 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.385855 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.407798 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.412473 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.412526 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.412542 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.412583 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.412598 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.435576 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.446610 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.446681 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.446698 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.446723 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.446741 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.464117 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:45Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:45 crc kubenswrapper[4785]: E1126 15:08:45.464335 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.466090 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.466123 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.466137 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.466155 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.466167 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.568640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.568718 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.568742 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.568772 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.568798 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.672325 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.672395 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.672424 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.672456 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.672479 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.775810 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.775852 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.775866 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.775898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.775912 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.878517 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.878664 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.878694 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.878725 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.878747 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.982903 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.982987 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.983009 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.983041 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:45 crc kubenswrapper[4785]: I1126 15:08:45.983062 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:45Z","lastTransitionTime":"2025-11-26T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.035297 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.035297 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:46 crc kubenswrapper[4785]: E1126 15:08:46.035458 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:46 crc kubenswrapper[4785]: E1126 15:08:46.035520 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.085592 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.085660 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.085678 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.085703 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.085721 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.188317 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.188351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.188360 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.188372 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.188380 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.291161 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.291232 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.291257 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.291284 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.291306 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.394023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.394087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.394099 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.394119 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.394131 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.496971 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.497045 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.497068 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.497093 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.497110 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.599796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.599877 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.599899 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.599928 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.599951 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.702843 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.702896 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.702910 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.702931 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.702945 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.805878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.805925 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.805936 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.805959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.805981 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.908631 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.908706 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.908722 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.908745 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:46 crc kubenswrapper[4785]: I1126 15:08:46.908759 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:46Z","lastTransitionTime":"2025-11-26T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.012482 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.012634 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.012669 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.012698 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.012718 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.035772 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:47 crc kubenswrapper[4785]: E1126 15:08:47.036391 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.036977 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:47 crc kubenswrapper[4785]: E1126 15:08:47.037160 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.039498 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:08:47 crc kubenswrapper[4785]: E1126 15:08:47.040071 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.049866 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.066439 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.079227 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc 
kubenswrapper[4785]: I1126 15:08:47.091031 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.105593 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.116303 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.116385 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.116412 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.116448 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.116474 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.117530 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.130669 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d
6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.153164 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c
72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.169465 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.186851 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.204973 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.220022 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.220324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.220351 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.220379 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.220393 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.220403 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.238051 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.251961 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.261803 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.273585 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.289925 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.301303 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: 
I1126 15:08:47.322943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.322989 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.322998 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.323013 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.323025 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.329043 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:33Z\\\",\\\"message\\\":\\\"40702 6788 obj_retry.go:551] Creating *factory.egressNode crc took: 3.419578ms\\\\nI1126 15:08:33.840718 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1126 15:08:33.840744 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1126 15:08:33.840780 6788 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1126 15:08:33.840740 6788 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1126 15:08:33.840820 6788 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 870.792µs\\\\nI1126 15:08:33.840809 6788 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.189107ms\\\\nI1126 15:08:33.840943 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:33.841021 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:33.841047 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:33.841064 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:33.841175 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:47Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.425948 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.426189 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.426260 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.426328 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.426396 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.528971 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.529033 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.529056 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.529086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.529111 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.632253 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.632324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.632350 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.632400 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.632424 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.735308 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.735367 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.735388 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.735416 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.735437 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.838645 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.838711 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.838735 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.838766 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.838785 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.942503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.942859 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.943004 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.943150 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:47 crc kubenswrapper[4785]: I1126 15:08:47.943295 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:47Z","lastTransitionTime":"2025-11-26T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.035600 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:48 crc kubenswrapper[4785]: E1126 15:08:48.035835 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.036093 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:48 crc kubenswrapper[4785]: E1126 15:08:48.036271 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.045758 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.045835 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.045860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.045887 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.045908 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.154445 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.154516 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.154535 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.154590 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.154609 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.257860 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.257937 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.257959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.257986 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.258009 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.361283 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.361378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.361413 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.361449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.361473 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.463964 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.464041 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.464063 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.464092 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.464114 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.566684 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.566722 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.566733 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.566749 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.566759 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.669389 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.669511 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.669539 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.669614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.669637 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.772530 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.772617 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.772634 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.772656 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.772672 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.875441 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.875490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.875502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.875522 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.875534 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.978955 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.979022 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.979042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.979067 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:48 crc kubenswrapper[4785]: I1126 15:08:48.979088 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:48Z","lastTransitionTime":"2025-11-26T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.035453 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.035618 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:49 crc kubenswrapper[4785]: E1126 15:08:49.035831 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:49 crc kubenswrapper[4785]: E1126 15:08:49.035999 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.082831 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.082927 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.082949 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.082977 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.082996 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.184881 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.184920 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.184929 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.184944 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.184954 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.287374 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.287444 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.287465 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.287491 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.287511 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.390344 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.390423 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.390450 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.390481 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.390506 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.493049 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.493096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.493109 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.493128 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.493143 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.595863 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.595926 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.595938 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.595959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.595973 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.698503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.698547 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.698573 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.698593 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.698605 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.801989 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.802058 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.802074 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.802097 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.802117 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.905164 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.905218 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.905229 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.905248 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:49 crc kubenswrapper[4785]: I1126 15:08:49.905261 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:49Z","lastTransitionTime":"2025-11-26T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.008116 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.008170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.008187 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.008210 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.008228 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.035754 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.035770 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:50 crc kubenswrapper[4785]: E1126 15:08:50.035983 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:50 crc kubenswrapper[4785]: E1126 15:08:50.036081 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.111525 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.111595 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.111606 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.111622 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.111637 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.215211 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.215261 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.215272 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.215288 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.215300 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.317913 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.317963 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.317975 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.317991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.318003 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.421627 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.421706 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.421728 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.421756 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.421778 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.525041 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.525104 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.525121 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.525145 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.525161 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.628503 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.628614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.628640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.628669 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.628688 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.731999 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.732047 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.732059 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.732078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.732090 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.835413 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.835487 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.835510 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.835537 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.835585 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.945756 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.945829 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.945848 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.945872 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:50 crc kubenswrapper[4785]: I1126 15:08:50.945889 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:50Z","lastTransitionTime":"2025-11-26T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.035758 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.036033 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:51 crc kubenswrapper[4785]: E1126 15:08:51.036156 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:51 crc kubenswrapper[4785]: E1126 15:08:51.036237 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.048490 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.048611 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.048638 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.048670 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.048694 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.152362 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.152461 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.152479 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.152504 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.152521 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.255605 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.255691 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.255714 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.255747 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.255770 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.359359 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.359434 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.359499 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.359530 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.359603 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.462497 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.462616 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.462642 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.462665 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.462682 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.565366 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.565408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.565418 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.565433 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.565445 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.670912 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.670981 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.670999 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.671024 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.671045 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.774083 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.774149 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.774163 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.774180 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.774192 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.877456 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.877521 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.877544 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.877605 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.877624 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.984648 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.984719 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.984735 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.984762 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:51 crc kubenswrapper[4785]: I1126 15:08:51.984782 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:51Z","lastTransitionTime":"2025-11-26T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.036440 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.036466 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:52 crc kubenswrapper[4785]: E1126 15:08:52.036736 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:52 crc kubenswrapper[4785]: E1126 15:08:52.036973 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.088233 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.088401 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.088430 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.088461 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.088487 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.191227 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.191352 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.191374 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.191406 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.191431 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.294344 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.294391 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.294402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.294418 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.294429 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.397427 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.397476 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.397491 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.397511 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.397524 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.500785 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.500870 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.500888 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.500921 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.500947 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.607757 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.607807 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.607826 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.607848 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.607864 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.710019 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.710043 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.710050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.710063 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.710072 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.812923 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.812983 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.812999 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.813023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.813040 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.914615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.914652 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.914660 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.914672 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:52 crc kubenswrapper[4785]: I1126 15:08:52.914680 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:52Z","lastTransitionTime":"2025-11-26T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.017791 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.017839 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.017852 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.017878 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.017890 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.035748 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:53 crc kubenswrapper[4785]: E1126 15:08:53.035865 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.036005 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:53 crc kubenswrapper[4785]: E1126 15:08:53.036196 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.121034 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.121092 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.121109 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.121131 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.121148 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.225892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.225953 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.225969 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.225991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.226007 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.328719 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.328766 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.328781 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.328802 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.328818 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.432374 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.432528 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.432548 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.432608 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.432627 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.536320 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.536385 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.536408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.536437 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.536463 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.638841 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.638939 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.638958 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.638982 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.639003 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.742170 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.742241 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.742262 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.742286 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.742303 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.845121 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.845193 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.845216 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.845245 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.845266 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.949065 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.949149 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.949194 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.949230 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:53 crc kubenswrapper[4785]: I1126 15:08:53.949250 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:53Z","lastTransitionTime":"2025-11-26T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.036543 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.036687 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:54 crc kubenswrapper[4785]: E1126 15:08:54.036791 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:54 crc kubenswrapper[4785]: E1126 15:08:54.036877 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.052711 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.052778 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.052801 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.052830 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.052854 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.156463 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.156528 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.156545 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.156613 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.156634 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.259695 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.259766 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.259790 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.259814 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.259831 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.363430 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.363495 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.363512 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.363535 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.363591 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.465285 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.465319 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.465327 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.465340 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.465349 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.568050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.568128 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.568146 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.568165 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.568215 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.670959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.671009 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.671019 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.671042 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.671057 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.773418 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.773471 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.773483 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.773502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.773511 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.875986 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.876063 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.876075 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.876089 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.876101 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.979426 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.979502 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.979524 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.979554 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:54 crc kubenswrapper[4785]: I1126 15:08:54.979621 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:54Z","lastTransitionTime":"2025-11-26T15:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.036177 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.036222 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.036393 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.036721 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.082352 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.082408 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.082418 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.082435 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.082446 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.186516 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.186599 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.186617 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.186640 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.186663 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.289986 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.290039 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.290060 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.290087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.290107 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.393186 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.393221 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.393233 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.393249 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.393260 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.480886 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.480922 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.480935 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.480950 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.480961 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.498524 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:55Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.504202 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.504326 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.504402 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.504434 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.504461 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.524920 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:55Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.530164 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.530228 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.530251 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.530281 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.530305 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.552698 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:55Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.556761 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.556819 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.556841 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.556868 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.556889 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.572089 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:55Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.576868 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.577004 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.577030 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.577062 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.577086 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.601063 4785 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"564b0a72-079b-4004-bac6-7e7947bc6860\\\",\\\"systemUUID\\\":\\\"0559caf5-1d73-4afa-a2e0-e8d6b738bfd5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:55Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:55 crc kubenswrapper[4785]: E1126 15:08:55.601321 4785 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.604022 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.604078 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.604095 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.604117 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.604135 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.707282 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.707327 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.707343 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.707364 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.707381 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.810272 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.810324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.810333 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.810347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.810358 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.913354 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.913411 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.913429 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.913459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:55 crc kubenswrapper[4785]: I1126 15:08:55.913478 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:55Z","lastTransitionTime":"2025-11-26T15:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.017129 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.017192 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.017210 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.017234 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.017253 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.035669 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.035746 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:56 crc kubenswrapper[4785]: E1126 15:08:56.035788 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:56 crc kubenswrapper[4785]: E1126 15:08:56.035946 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.119450 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.119515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.119537 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.119594 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.119616 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.222252 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.222321 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.222338 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.222364 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.222382 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.325691 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.325747 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.325765 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.325786 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.325804 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.428977 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.429052 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.429082 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.429109 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.429130 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.532028 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.532087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.532105 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.532129 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.532152 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.634921 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.634967 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.634979 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.634997 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.635009 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.722974 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:56 crc kubenswrapper[4785]: E1126 15:08:56.723166 4785 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:56 crc kubenswrapper[4785]: E1126 15:08:56.723289 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs podName:72903df2-b694-4229-96b5-167500cab723 nodeName:}" failed. No retries permitted until 2025-11-26 15:10:00.723255436 +0000 UTC m=+164.401621240 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs") pod "network-metrics-daemon-qdfwp" (UID: "72903df2-b694-4229-96b5-167500cab723") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.737795 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.737842 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.737855 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.737872 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.737897 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.840420 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.840465 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.840481 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.840501 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.840518 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.943141 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.943191 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.943200 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.943214 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:56 crc kubenswrapper[4785]: I1126 15:08:56.943226 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:56Z","lastTransitionTime":"2025-11-26T15:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.036085 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.036156 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:57 crc kubenswrapper[4785]: E1126 15:08:57.036236 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:57 crc kubenswrapper[4785]: E1126 15:08:57.036286 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.045241 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.045283 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.045295 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.045309 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.045317 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.047036 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fc8b9de-d617-46d7-8834-84da7611f363\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08b2ab738589b29c6c67e3319ccab97ca16955fbc26dfa0dcf8a4374f6cc0d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad113c4b2e70dde555f17e5b90363d0c0d5bcd70fa99ed5422ffc32bce3253c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.061329 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.073028 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-smv28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d77660b-5b10-4573-84ab-3dc318d4b4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3bb62d64073e9a5efa6ec04394fb33bf1498892d59e0843f323a728dff2e7b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwfft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-smv28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.086533 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09c65f70-203d-40eb-a45e-ed8d7e36912f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69bcee7670290dd116b68ea6d15585455861981551919b4b553fc699f43161e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba1f2e54aa3b8e81f5bc6ad4531d5976cfd8d6c4c15f16cefecc75c3c6189ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dg52p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zk6jt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.100527 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2473df-8540-437a-9b68-a0c79c8f1189\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-26T15:07:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1126 15:07:30.557519 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1126 15:07:30.559919 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2275430498/tls.crt::/tmp/serving-cert-2275430498/tls.key\\\\\\\"\\\\nI1126 15:07:36.843783 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1126 15:07:36.847546 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1126 15:07:36.847601 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1126 15:07:36.847630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1126 15:07:36.847642 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1126 15:07:36.858915 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1126 15:07:36.858955 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858960 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1126 15:07:36.858971 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1126 15:07:36.858975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1126 15:07:36.858978 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1126 15:07:36.858982 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1126 15:07:36.859422 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1126 15:07:36.862125 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.114825 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f369bd7df062792e57a0cee19e7be96c6850b20d6e739a0c7fa5eccc14ae1d1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.137693 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.147216 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.147261 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.147271 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.147305 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.147319 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.148673 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ab8b8fda3321443cb24c214c10ba7171f01c80adf81860f4aa931062ec886d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.160575 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.174029 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d83039-5d86-45bb-a5c1-ca5b94ed92c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5459c3414a7754407ac6d182e88047bbe1b7b9750d15c7a88767c50c4b9dc720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f7072a2a9067489a27300fea823075b89d71d45a1d972bd9d94226b01c2c123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://538d65a0b5f3e2c45c1b9761f2c73e958d5de0873652f20f4047631423e538eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1abf2b4c1a00c0bc17acaf7547130008127579df59592382ae6e81430936e6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35a21
51729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35a2151729e5cb2ae347214fc91e6de116a4062943e4cb4a27ab0553436953ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51e95d33c83c57a3b5a7f580eb9cac5e50575e699c066694a9ae8fd4dcdfb20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20f87e07ed26b3b93d4dad87f971e4ee3bf3c46299491a95990ab34c63167f93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gdb7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xbz7b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.185431 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://183451ec5e6ad1983b6e56a53adfcf2eaf3489dcfbc3cbc709f67002691b9010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqzs8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkxdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: 
I1126 15:08:57.196829 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbac4f7f-578e-41e6-88a0-4ad8d2f5eef2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320fc519a91dfed4d5c233e7c1c2f21aadf25336ccade8830069aae4bd804a68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c964ff713703fe4484d76d86473279ffc4a3eef0cf3dd8b29aea7fdc6271616\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03ac9c745ec9eb21263a599561496630202872ef631f5e846e0475efedee6943\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.206527 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c0357f-6901-420b-a246-f5ef95e4fb7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2553398db49910f840349bfd11fd5ce7908438bdbcf5a654355c10fb3c3d610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c4c50721c470ba97725e59a42ae3cf740017ed7027a9e5b0a7e1906f40fd49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8dc60997fbeb399187b6bc95276994ff0a181732022bfff1e229928996f90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://169def9ac03274e05424365ce5b2fc1236d495e846603c29fa6efbaae27b53b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.224033 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec3c2133-1999-46b4-b5a6-c1b800f04c30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee866b365ef5e15c3cc262ab6ad0649b10fa7fdf0c10c3f747d9e20d48535e76\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79b8f6bd2047642cacd4effcee8f9760c7c9480092993c7a25eabe4727bd8238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee1454ae94951e83f437b999f871e4965a9199a20bcd6392a6e07a7bfa9c6aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257e51cb9b93727ccb616e193abc21dfed1ee9ee2f1ce459e35728457dae5280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc743861f85f56a87c6a20d7152af59e554284fe13b8212c5dae854438771406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ff60b31932a511f9156f1416e112d115fcbc79a0a6030143ba9f5ddd327577a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff524be439c6bf410b9377cd503516b43be3826e4c2fcd03a6ed9bf3fcfab95b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:19Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26cb5ba5508ef320a24ba05e352547c9feb82f37ebeab58ad7c0a073d03b8c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.235123 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8acfa933b476aa5e196c195906225ae92af5cca3353292e23f29f87ee374d0ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432643beb27bff31e146230a60837b1292cddb8595ede8ed7f929170fceeb17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.251281 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.251314 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.251327 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.251347 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.251360 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.252320 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"862c58fd-3f79-4276-bd76-ce689d32cbd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:33Z\\\",\\\"message\\\":\\\"40702 6788 obj_retry.go:551] Creating *factory.egressNode crc took: 3.419578ms\\\\nI1126 15:08:33.840718 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1126 15:08:33.840744 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1126 15:08:33.840780 6788 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1126 15:08:33.840740 6788 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1126 15:08:33.840820 6788 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 870.792µs\\\\nI1126 15:08:33.840809 6788 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 2.189107ms\\\\nI1126 15:08:33.840943 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1126 15:08:33.841021 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1126 15:08:33.841047 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1126 15:08:33.841064 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1126 15:08:33.841175 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://049fc6de07d4b51e1c
898035109c3427dff189cc509f5973db7cb75c3a022bc1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgxb5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-925q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.262799 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hk884" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9a87c55-b930-4993-88b8-15902e000caa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://018c4591e8ee49a626a6357d6672b723d221f62eaa62face83fd6c3d5f1b3bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:07:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9n9rk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hk884\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.276726 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6q4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"855bd894-cca9-4fe1-a0d5-8b72afe7c93a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:08:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-26T15:08:26Z\\\",\\\"message\\\":\\\"2025-11-26T15:07:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b\\\\n2025-11-26T15:07:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bd6b8591-1e03-43be-806d-3dc58de1780b to /host/opt/cni/bin/\\\\n2025-11-26T15:07:41Z [verbose] multus-daemon started\\\\n2025-11-26T15:07:41Z [verbose] Readiness Indicator file check\\\\n2025-11-26T15:08:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-26T15:07:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-26T15:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7zv2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6q4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.286870 4785 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72903df2-b694-4229-96b5-167500cab723\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-26T15:07:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrprh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-26T15:07:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qdfwp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-26T15:08:57Z is after 2025-08-24T17:21:41Z" Nov 26 15:08:57 crc 
kubenswrapper[4785]: I1126 15:08:57.353572 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.353615 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.353625 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.353641 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.353652 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.457241 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.457324 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.457346 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.457372 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.457390 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.560658 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.560750 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.560769 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.560829 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.560848 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.663745 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.663809 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.663826 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.663850 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.663870 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.766644 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.767030 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.767232 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.767411 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.767615 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.870877 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.870945 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.870961 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.870985 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.871002 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.974790 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.975050 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.975225 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.975303 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:57 crc kubenswrapper[4785]: I1126 15:08:57.975366 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:57Z","lastTransitionTime":"2025-11-26T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.035523 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.035644 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:08:58 crc kubenswrapper[4785]: E1126 15:08:58.035734 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:08:58 crc kubenswrapper[4785]: E1126 15:08:58.035785 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.037168 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:08:58 crc kubenswrapper[4785]: E1126 15:08:58.037549 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.078441 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.078636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.078659 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.078870 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.078905 4785 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.181726 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.181779 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.181792 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.181813 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.181826 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.284772 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.284834 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.284851 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.284874 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.284891 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.387806 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.387859 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.387876 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.387898 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.387914 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.491473 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.491535 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.491578 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.491604 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.491698 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.594858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.594920 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.594955 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.594984 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.595005 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.697622 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.697685 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.697704 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.697726 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.697743 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.801699 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.801762 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.801781 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.801808 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.801829 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.905526 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.905604 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.905621 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.905643 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:58 crc kubenswrapper[4785]: I1126 15:08:58.905659 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:58Z","lastTransitionTime":"2025-11-26T15:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.008459 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.008541 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.008606 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.008632 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.008649 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.035316 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:08:59 crc kubenswrapper[4785]: E1126 15:08:59.035655 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.035697 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:08:59 crc kubenswrapper[4785]: E1126 15:08:59.036259 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.110650 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.110706 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.110720 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.110738 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.110750 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.214023 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.214096 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.214113 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.214138 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.214156 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.317321 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.317391 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.317409 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.317436 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.317459 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.421002 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.421073 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.421099 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.421127 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.421149 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.524213 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.524264 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.524282 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.524309 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.524332 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.627253 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.627349 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.627373 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.627404 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.627423 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.730813 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.730893 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.730916 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.730942 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.730964 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.833355 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.833410 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.833426 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.833444 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.833455 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.936533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.936621 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.936641 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.936667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:08:59 crc kubenswrapper[4785]: I1126 15:08:59.936686 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:08:59Z","lastTransitionTime":"2025-11-26T15:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.036394 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.036474 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:00 crc kubenswrapper[4785]: E1126 15:09:00.036635 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:00 crc kubenswrapper[4785]: E1126 15:09:00.036786 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.039099 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.039149 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.039165 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.039189 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.039206 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.142841 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.142908 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.142925 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.142952 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.142971 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.245846 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.245929 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.245958 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.245984 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.246003 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.349420 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.349489 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.349507 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.349533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.349583 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.452108 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.452178 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.452195 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.452220 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.452241 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.554822 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.554871 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.554888 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.554911 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.554930 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.658622 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.658683 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.658701 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.658723 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.658741 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.761614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.761659 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.761667 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.761683 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.761692 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.864992 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.865066 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.865086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.865110 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.865128 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.968107 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.968201 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.968225 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.968256 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:00 crc kubenswrapper[4785]: I1126 15:09:00.968331 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:00Z","lastTransitionTime":"2025-11-26T15:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.035895 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.036213 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:01 crc kubenswrapper[4785]: E1126 15:09:01.036438 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:01 crc kubenswrapper[4785]: E1126 15:09:01.036636 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.071517 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.071609 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.071627 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.071655 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.071678 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.174311 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.174387 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.174410 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.174440 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.174460 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.278423 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.278505 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.278530 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.278590 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.278663 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.381634 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.381721 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.381744 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.381774 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.381797 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.484899 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.484945 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.484955 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.484973 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.484985 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.588290 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.588370 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.588403 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.588431 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.588453 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.690514 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.690593 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.690617 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.690636 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.690650 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.794086 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.794133 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.794149 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.794173 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.794190 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.896716 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.896873 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.896906 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.896936 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:01 crc kubenswrapper[4785]: I1126 15:09:01.896958 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:01Z","lastTransitionTime":"2025-11-26T15:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.000378 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.000446 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.000487 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.000520 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.000544 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.035860 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.035920 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:02 crc kubenswrapper[4785]: E1126 15:09:02.036149 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:02 crc kubenswrapper[4785]: E1126 15:09:02.036451 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.103809 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.103858 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.103874 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.103894 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.103909 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.206771 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.206839 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.206857 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.206884 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.206902 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.310348 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.310413 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.310430 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.310453 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.310470 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.413172 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.413238 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.413256 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.413283 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.413300 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.516380 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.516469 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.516513 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.516539 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.516604 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.618533 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.618602 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.618614 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.618629 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.618663 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.721687 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.721753 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.721770 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.721796 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.721814 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.824629 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.824676 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.824686 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.824701 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.824712 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.926778 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.926864 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.926873 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.926886 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:02 crc kubenswrapper[4785]: I1126 15:09:02.926895 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:02Z","lastTransitionTime":"2025-11-26T15:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.029024 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.029073 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.029084 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.029099 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.029111 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.035975 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.036037 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:03 crc kubenswrapper[4785]: E1126 15:09:03.036072 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:03 crc kubenswrapper[4785]: E1126 15:09:03.036342 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.131855 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.131922 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.131943 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.131967 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.131986 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.235700 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.235806 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.235825 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.235851 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.235869 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.339212 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.339273 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.339289 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.339312 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.339331 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.442965 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.443084 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.443102 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.443128 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.443151 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.546188 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.546276 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.546294 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.546317 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.546334 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.649400 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.649449 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.649467 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.649492 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.649509 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.753217 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.753287 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.753304 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.753328 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.753345 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.856845 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.856914 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.856940 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.856971 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.856993 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.959908 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.959991 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.960011 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.960034 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:03 crc kubenswrapper[4785]: I1126 15:09:03.960056 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:03Z","lastTransitionTime":"2025-11-26T15:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.036010 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.036046 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:04 crc kubenswrapper[4785]: E1126 15:09:04.036243 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:04 crc kubenswrapper[4785]: E1126 15:09:04.036365 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.063581 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.063663 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.063688 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.063716 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.063739 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.166934 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.167034 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.167054 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.167079 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.167098 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.271399 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.271515 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.271537 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.271591 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.271609 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.375315 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.375388 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.375405 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.375431 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.375450 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.478759 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.478892 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.478928 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.478959 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.479035 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.582014 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.582087 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.582107 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.582133 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.582153 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.684511 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.684545 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.684571 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.684588 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.684599 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.787431 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.787472 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.787481 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.787494 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.787504 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.890602 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.890688 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.890713 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.890744 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.890765 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.993611 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.993675 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.993692 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.993718 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:04 crc kubenswrapper[4785]: I1126 15:09:04.993742 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:04Z","lastTransitionTime":"2025-11-26T15:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.035978 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.036005 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:05 crc kubenswrapper[4785]: E1126 15:09:05.036188 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:05 crc kubenswrapper[4785]: E1126 15:09:05.036275 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.096906 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.096950 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.096961 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.096980 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.096992 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.200251 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.200320 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.200337 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.200362 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.200379 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.303470 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.303517 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.303527 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.303540 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.303549 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.407000 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.407065 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.407082 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.407106 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.407122 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.510631 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.510703 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.510727 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.510755 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.510776 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.613911 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.613984 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.614008 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.614036 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.614057 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.716677 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.716793 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.716810 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.716835 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.716854 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.819678 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.819762 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.819775 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.819792 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.819803 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.857202 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.857281 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.857303 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.857336 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.857359 4785 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-26T15:09:05Z","lastTransitionTime":"2025-11-26T15:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.936239 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp"] Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.937204 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.940805 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.941864 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.942247 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.942364 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 26 15:09:05 crc kubenswrapper[4785]: I1126 15:09:05.968421 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.968399934 podStartE2EDuration="1m28.968399934s" podCreationTimestamp="2025-11-26 15:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:05.968146838 +0000 UTC m=+109.646512622" watchObservedRunningTime="2025-11-26 15:09:05.968399934 +0000 UTC m=+109.646765708" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.026308 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4978000f-bef2-4046-844e-85f39e553645-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.026353 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4978000f-bef2-4046-844e-85f39e553645-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.026406 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4978000f-bef2-4046-844e-85f39e553645-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.026445 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4978000f-bef2-4046-844e-85f39e553645-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.026467 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4978000f-bef2-4046-844e-85f39e553645-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.035629 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.035652 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:06 crc kubenswrapper[4785]: E1126 15:09:06.035749 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:06 crc kubenswrapper[4785]: E1126 15:09:06.035987 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.045070 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podStartSLOduration=88.045049742 podStartE2EDuration="1m28.045049742s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.044519908 +0000 UTC m=+109.722885692" watchObservedRunningTime="2025-11-26 15:09:06.045049742 +0000 UTC m=+109.723415536" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.045412 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xbz7b" podStartSLOduration=88.045400411 podStartE2EDuration="1m28.045400411s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.031279815 +0000 UTC m=+109.709645609" watchObservedRunningTime="2025-11-26 15:09:06.045400411 +0000 UTC m=+109.723766205" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.068780 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=84.068760297 podStartE2EDuration="1m24.068760297s" podCreationTimestamp="2025-11-26 15:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.067406032 +0000 UTC m=+109.745771796" watchObservedRunningTime="2025-11-26 15:09:06.068760297 +0000 UTC m=+109.747126071" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.086060 4785 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.086027645 podStartE2EDuration="56.086027645s" podCreationTimestamp="2025-11-26 15:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.084945666 +0000 UTC m=+109.763311530" watchObservedRunningTime="2025-11-26 15:09:06.086027645 +0000 UTC m=+109.764393409" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.128668 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=86.12864558 podStartE2EDuration="1m26.12864558s" podCreationTimestamp="2025-11-26 15:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.112309606 +0000 UTC m=+109.790675410" watchObservedRunningTime="2025-11-26 15:09:06.12864558 +0000 UTC m=+109.807011364" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129718 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4978000f-bef2-4046-844e-85f39e553645-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129753 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4978000f-bef2-4046-844e-85f39e553645-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129833 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4978000f-bef2-4046-844e-85f39e553645-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129843 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4978000f-bef2-4046-844e-85f39e553645-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129862 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4978000f-bef2-4046-844e-85f39e553645-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129904 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4978000f-bef2-4046-844e-85f39e553645-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.129918 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4978000f-bef2-4046-844e-85f39e553645-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.130690 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4978000f-bef2-4046-844e-85f39e553645-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.141921 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4978000f-bef2-4046-844e-85f39e553645-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.152202 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4978000f-bef2-4046-844e-85f39e553645-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2wwgp\" (UID: \"4978000f-bef2-4046-844e-85f39e553645\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.192509 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hk884" podStartSLOduration=88.192492704 podStartE2EDuration="1m28.192492704s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.192229118 +0000 UTC m=+109.870594902" watchObservedRunningTime="2025-11-26 15:09:06.192492704 +0000 UTC m=+109.870858468" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.219399 4785 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6q4xd" podStartSLOduration=88.219383522 podStartE2EDuration="1m28.219383522s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.210093631 +0000 UTC m=+109.888459425" watchObservedRunningTime="2025-11-26 15:09:06.219383522 +0000 UTC m=+109.897749306" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.228989 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=44.22897314 podStartE2EDuration="44.22897314s" podCreationTimestamp="2025-11-26 15:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.228420416 +0000 UTC m=+109.906786190" watchObservedRunningTime="2025-11-26 15:09:06.22897314 +0000 UTC m=+109.907338894" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.251238 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-smv28" podStartSLOduration=88.251211657 podStartE2EDuration="1m28.251211657s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.250870028 +0000 UTC m=+109.929235782" watchObservedRunningTime="2025-11-26 15:09:06.251211657 +0000 UTC m=+109.929577431" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.261514 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zk6jt" podStartSLOduration=88.261495684 podStartE2EDuration="1m28.261495684s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.261376861 +0000 UTC m=+109.939742635" watchObservedRunningTime="2025-11-26 15:09:06.261495684 +0000 UTC m=+109.939861458" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.271401 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.636107 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" event={"ID":"4978000f-bef2-4046-844e-85f39e553645","Type":"ContainerStarted","Data":"828ad183bdb7df6b17f386e19566518a9c696819a7874f60776ebe1f8503cabc"} Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.636663 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" event={"ID":"4978000f-bef2-4046-844e-85f39e553645","Type":"ContainerStarted","Data":"1eafafb8e2f1dcc6d58d7716fa5dc5386349bc4c2e53abe3fa788814337363ee"} Nov 26 15:09:06 crc kubenswrapper[4785]: I1126 15:09:06.661765 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2wwgp" podStartSLOduration=88.661738443 podStartE2EDuration="1m28.661738443s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:06.661424875 +0000 UTC m=+110.339790669" watchObservedRunningTime="2025-11-26 15:09:06.661738443 +0000 UTC m=+110.340104247" Nov 26 15:09:07 crc kubenswrapper[4785]: I1126 15:09:07.035930 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:07 crc kubenswrapper[4785]: I1126 15:09:07.035969 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:07 crc kubenswrapper[4785]: E1126 15:09:07.038453 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:07 crc kubenswrapper[4785]: E1126 15:09:07.039080 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:08 crc kubenswrapper[4785]: I1126 15:09:08.035739 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:08 crc kubenswrapper[4785]: I1126 15:09:08.035950 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:08 crc kubenswrapper[4785]: E1126 15:09:08.036139 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:08 crc kubenswrapper[4785]: E1126 15:09:08.036431 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:09 crc kubenswrapper[4785]: I1126 15:09:09.036293 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:09 crc kubenswrapper[4785]: I1126 15:09:09.036434 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:09 crc kubenswrapper[4785]: E1126 15:09:09.036630 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:09 crc kubenswrapper[4785]: E1126 15:09:09.037147 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:10 crc kubenswrapper[4785]: I1126 15:09:10.036096 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:10 crc kubenswrapper[4785]: I1126 15:09:10.036096 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:10 crc kubenswrapper[4785]: E1126 15:09:10.036845 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:10 crc kubenswrapper[4785]: E1126 15:09:10.036954 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:11 crc kubenswrapper[4785]: I1126 15:09:11.035790 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:11 crc kubenswrapper[4785]: I1126 15:09:11.035842 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:11 crc kubenswrapper[4785]: E1126 15:09:11.035978 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:11 crc kubenswrapper[4785]: E1126 15:09:11.036284 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.036179 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:12 crc kubenswrapper[4785]: E1126 15:09:12.036319 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.036341 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:12 crc kubenswrapper[4785]: E1126 15:09:12.036859 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.037292 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:09:12 crc kubenswrapper[4785]: E1126 15:09:12.037502 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-925q9_openshift-ovn-kubernetes(862c58fd-3f79-4276-bd76-ce689d32cbd6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.656484 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/1.log" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.657389 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/0.log" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.657469 4785 generic.go:334] "Generic (PLEG): container finished" podID="855bd894-cca9-4fe1-a0d5-8b72afe7c93a" containerID="96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff" exitCode=1 Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.657517 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerDied","Data":"96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff"} Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.657591 4785 scope.go:117] "RemoveContainer" containerID="041064cd63f3d654e02e5e310fbc190ca527a986959c9d7a35ab87962184f2a0" Nov 26 15:09:12 crc kubenswrapper[4785]: I1126 15:09:12.658204 4785 scope.go:117] "RemoveContainer" containerID="96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff" Nov 26 15:09:12 crc kubenswrapper[4785]: E1126 15:09:12.658607 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6q4xd_openshift-multus(855bd894-cca9-4fe1-a0d5-8b72afe7c93a)\"" pod="openshift-multus/multus-6q4xd" podUID="855bd894-cca9-4fe1-a0d5-8b72afe7c93a" Nov 26 15:09:13 crc kubenswrapper[4785]: I1126 15:09:13.035493 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:13 crc kubenswrapper[4785]: I1126 15:09:13.035493 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:13 crc kubenswrapper[4785]: E1126 15:09:13.035756 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:13 crc kubenswrapper[4785]: E1126 15:09:13.035890 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:13 crc kubenswrapper[4785]: I1126 15:09:13.663225 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/1.log" Nov 26 15:09:14 crc kubenswrapper[4785]: I1126 15:09:14.035805 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:14 crc kubenswrapper[4785]: I1126 15:09:14.035873 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:14 crc kubenswrapper[4785]: E1126 15:09:14.036073 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:14 crc kubenswrapper[4785]: E1126 15:09:14.036267 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:15 crc kubenswrapper[4785]: I1126 15:09:15.036022 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:15 crc kubenswrapper[4785]: I1126 15:09:15.036218 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:15 crc kubenswrapper[4785]: E1126 15:09:15.036794 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:15 crc kubenswrapper[4785]: E1126 15:09:15.037299 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:16 crc kubenswrapper[4785]: I1126 15:09:16.036155 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:16 crc kubenswrapper[4785]: E1126 15:09:16.036365 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:16 crc kubenswrapper[4785]: I1126 15:09:16.036164 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:16 crc kubenswrapper[4785]: E1126 15:09:16.036491 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:16 crc kubenswrapper[4785]: E1126 15:09:16.983264 4785 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 26 15:09:17 crc kubenswrapper[4785]: I1126 15:09:17.036423 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:17 crc kubenswrapper[4785]: I1126 15:09:17.036505 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:17 crc kubenswrapper[4785]: E1126 15:09:17.036744 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:17 crc kubenswrapper[4785]: E1126 15:09:17.036914 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:17 crc kubenswrapper[4785]: E1126 15:09:17.126105 4785 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 15:09:18 crc kubenswrapper[4785]: I1126 15:09:18.035983 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:18 crc kubenswrapper[4785]: I1126 15:09:18.035985 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:18 crc kubenswrapper[4785]: E1126 15:09:18.036428 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:18 crc kubenswrapper[4785]: E1126 15:09:18.036481 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:19 crc kubenswrapper[4785]: I1126 15:09:19.035784 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:19 crc kubenswrapper[4785]: E1126 15:09:19.036403 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:19 crc kubenswrapper[4785]: I1126 15:09:19.036071 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:19 crc kubenswrapper[4785]: E1126 15:09:19.036646 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:20 crc kubenswrapper[4785]: I1126 15:09:20.036342 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:20 crc kubenswrapper[4785]: I1126 15:09:20.036355 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:20 crc kubenswrapper[4785]: E1126 15:09:20.036651 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:20 crc kubenswrapper[4785]: E1126 15:09:20.036676 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:21 crc kubenswrapper[4785]: I1126 15:09:21.035328 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:21 crc kubenswrapper[4785]: I1126 15:09:21.035346 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:21 crc kubenswrapper[4785]: E1126 15:09:21.035532 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:21 crc kubenswrapper[4785]: E1126 15:09:21.035689 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:22 crc kubenswrapper[4785]: I1126 15:09:22.036265 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:22 crc kubenswrapper[4785]: I1126 15:09:22.036304 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:22 crc kubenswrapper[4785]: E1126 15:09:22.036469 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:22 crc kubenswrapper[4785]: E1126 15:09:22.036644 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:22 crc kubenswrapper[4785]: E1126 15:09:22.127828 4785 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 15:09:23 crc kubenswrapper[4785]: I1126 15:09:23.036362 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:23 crc kubenswrapper[4785]: I1126 15:09:23.036588 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:23 crc kubenswrapper[4785]: E1126 15:09:23.036763 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:23 crc kubenswrapper[4785]: E1126 15:09:23.036934 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.035537 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:24 crc kubenswrapper[4785]: E1126 15:09:24.035720 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.035815 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:24 crc kubenswrapper[4785]: E1126 15:09:24.036281 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.037210 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.708031 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/3.log" Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.711827 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerStarted","Data":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.712431 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:09:24 crc kubenswrapper[4785]: I1126 15:09:24.751665 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podStartSLOduration=106.751641156 podStartE2EDuration="1m46.751641156s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:24.750984739 +0000 UTC m=+128.429350593" watchObservedRunningTime="2025-11-26 
15:09:24.751641156 +0000 UTC m=+128.430006960" Nov 26 15:09:25 crc kubenswrapper[4785]: I1126 15:09:25.036179 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:25 crc kubenswrapper[4785]: I1126 15:09:25.036199 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:25 crc kubenswrapper[4785]: E1126 15:09:25.036418 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:25 crc kubenswrapper[4785]: E1126 15:09:25.036595 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:25 crc kubenswrapper[4785]: I1126 15:09:25.173535 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qdfwp"] Nov 26 15:09:25 crc kubenswrapper[4785]: I1126 15:09:25.173728 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:25 crc kubenswrapper[4785]: E1126 15:09:25.173854 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:26 crc kubenswrapper[4785]: I1126 15:09:26.035847 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:26 crc kubenswrapper[4785]: E1126 15:09:26.036511 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:27 crc kubenswrapper[4785]: I1126 15:09:27.035810 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:27 crc kubenswrapper[4785]: I1126 15:09:27.035967 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:27 crc kubenswrapper[4785]: E1126 15:09:27.036201 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:27 crc kubenswrapper[4785]: I1126 15:09:27.036235 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:27 crc kubenswrapper[4785]: E1126 15:09:27.036423 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:27 crc kubenswrapper[4785]: E1126 15:09:27.036661 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:27 crc kubenswrapper[4785]: E1126 15:09:27.128615 4785 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 26 15:09:28 crc kubenswrapper[4785]: I1126 15:09:28.036345 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:28 crc kubenswrapper[4785]: E1126 15:09:28.036505 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:28 crc kubenswrapper[4785]: I1126 15:09:28.036820 4785 scope.go:117] "RemoveContainer" containerID="96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff" Nov 26 15:09:28 crc kubenswrapper[4785]: I1126 15:09:28.730175 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/1.log" Nov 26 15:09:28 crc kubenswrapper[4785]: I1126 15:09:28.730323 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerStarted","Data":"72a7e155fb846c9d6ff923ffa51d0d8a953f1aa548ab41ffc0d7339cd228d85b"} Nov 26 15:09:29 crc kubenswrapper[4785]: I1126 15:09:29.035739 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:29 crc kubenswrapper[4785]: I1126 15:09:29.035788 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:29 crc kubenswrapper[4785]: E1126 15:09:29.035991 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:29 crc kubenswrapper[4785]: I1126 15:09:29.036101 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:29 crc kubenswrapper[4785]: E1126 15:09:29.036288 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:29 crc kubenswrapper[4785]: E1126 15:09:29.036510 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:30 crc kubenswrapper[4785]: I1126 15:09:30.036191 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:30 crc kubenswrapper[4785]: E1126 15:09:30.036377 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:31 crc kubenswrapper[4785]: I1126 15:09:31.036311 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:31 crc kubenswrapper[4785]: I1126 15:09:31.036392 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:31 crc kubenswrapper[4785]: I1126 15:09:31.036416 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:31 crc kubenswrapper[4785]: E1126 15:09:31.036583 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qdfwp" podUID="72903df2-b694-4229-96b5-167500cab723" Nov 26 15:09:31 crc kubenswrapper[4785]: E1126 15:09:31.036692 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 26 15:09:31 crc kubenswrapper[4785]: E1126 15:09:31.036870 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 26 15:09:32 crc kubenswrapper[4785]: I1126 15:09:32.035927 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:32 crc kubenswrapper[4785]: E1126 15:09:32.036163 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.036214 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.036223 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.037609 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.040050 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.042021 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.042235 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.042047 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.042055 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 26 15:09:33 crc kubenswrapper[4785]: I1126 15:09:33.042765 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 15:09:34 crc kubenswrapper[4785]: I1126 15:09:34.035994 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.296458 4785 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.355462 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dss42"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.358196 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.374093 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.374862 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.376595 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lcfs6"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.377508 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.378109 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s6nf7"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.378625 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.379117 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.380176 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.380685 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.383124 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.387266 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.387822 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.389133 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pnqcw"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.392766 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.393541 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.394263 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z8vql"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.394936 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.395591 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.396181 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.396930 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.397116 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.397286 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.404719 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.407043 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.407275 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.408626 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-client-ca\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.408668 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c0b576-b3af-4d08-8e09-2c3728c8623e-serving-cert\") pod 
\"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.408784 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-config\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.408844 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.408866 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5kn4\" (UniqueName: \"kubernetes.io/projected/21c0b576-b3af-4d08-8e09-2c3728c8623e-kube-api-access-n5kn4\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.409663 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.410074 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.410358 4785 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.410511 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.410913 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.412779 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413076 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413281 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413374 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413500 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413577 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413804 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.413320 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 26 15:09:36 crc 
kubenswrapper[4785]: I1126 15:09:36.413912 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.414289 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.414485 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.414718 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.415432 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.416011 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.416368 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.416641 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.416882 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.417058 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.417239 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 26 
15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.417804 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418101 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418280 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418303 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418455 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418592 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418703 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418830 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418879 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418940 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.418991 
4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.419074 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.419128 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.419177 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.419244 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.419352 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.419587 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.420061 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.420220 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.420320 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.420330 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.420482 4785 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.425457 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.426353 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jmg2s"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.426937 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.427344 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.427770 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.428144 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.428420 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.429769 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6cx55"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.430119 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.430334 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzwjs"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.431421 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.433098 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.458811 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vn49n"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.459414 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrsd2"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.460191 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vn49n" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.460496 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.460681 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.461117 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.461362 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.479390 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-4v4sr"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.479795 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.479938 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.480795 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.483094 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dss42"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.489750 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.490134 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.490321 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.490425 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.490528 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.490694 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.492065 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.492146 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.492362 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 26 15:09:36 crc 
kubenswrapper[4785]: I1126 15:09:36.492458 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.492779 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.492935 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.492986 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493029 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493182 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493203 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493417 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493570 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493627 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 
15:09:36.493781 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493736 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.493708 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.494094 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.494228 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.494305 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.494584 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.494975 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.495882 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.496249 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.496594 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.497070 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.497826 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.501471 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.501791 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.501918 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.502416 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.502695 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.503534 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.504181 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.504393 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.506307 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.507141 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.507268 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.507273 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.508224 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509343 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509801 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-trusted-ca-bundle\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509844 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h75g\" (UniqueName: \"kubernetes.io/projected/b475816f-0aee-4aed-92a4-82ced173b416-kube-api-access-2h75g\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509877 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzr2\" (UniqueName: \"kubernetes.io/projected/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-kube-api-access-7wzr2\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509892 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-serving-cert\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509920 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-client-ca\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509937 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c0b576-b3af-4d08-8e09-2c3728c8623e-serving-cert\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509953 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b475816f-0aee-4aed-92a4-82ced173b416-serving-cert\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.509984 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-config\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510000 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-oauth-serving-cert\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510019 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510037 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-oauth-config\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510054 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5kn4\" (UniqueName: \"kubernetes.io/projected/21c0b576-b3af-4d08-8e09-2c3728c8623e-kube-api-access-n5kn4\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510071 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b475816f-0aee-4aed-92a4-82ced173b416-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510088 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-service-ca\") pod \"console-f9d7485db-6cx55\" (UID: 
\"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510111 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-config\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510180 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510528 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.510904 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-client-ca\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.511321 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-config\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.511451 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.511605 4785 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.511731 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.511858 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vt4z2"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.513142 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.515092 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.515487 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.518135 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.520920 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.543183 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.543613 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.543955 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.544079 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c0b576-b3af-4d08-8e09-2c3728c8623e-serving-cert\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.544203 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.545787 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.545866 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.545883 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.546205 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.546205 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.546971 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc 
kubenswrapper[4785]: I1126 15:09:36.547375 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.547538 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.548269 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hlltg"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.548811 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.549808 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.550034 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.550733 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.551246 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.554226 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.554801 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.560643 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.561238 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.562065 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.563802 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mvkhw"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.564451 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.565745 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.567046 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cllbv"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.568827 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.569514 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.570090 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pnqcw"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.570137 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.571442 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.572902 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.574201 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z8vql"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.575123 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.576039 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.578670 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vn49n"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.578741 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lcfs6"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.582116 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s6nf7"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.582788 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.583590 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6cx55"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.584640 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.586512 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrsd2"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.591838 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.597148 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.598740 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d"] Nov 26 
15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.600355 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.600988 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.601415 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vt4z2"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.604993 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.611017 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzwjs"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.611101 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jmg2s"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613070 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613721 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-oauth-config\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613797 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b475816f-0aee-4aed-92a4-82ced173b416-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613831 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-service-ca\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613882 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-trusted-ca-bundle\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613932 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h75g\" (UniqueName: \"kubernetes.io/projected/b475816f-0aee-4aed-92a4-82ced173b416-kube-api-access-2h75g\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613960 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-serving-cert\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.613985 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzr2\" (UniqueName: 
\"kubernetes.io/projected/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-kube-api-access-7wzr2\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.614026 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b475816f-0aee-4aed-92a4-82ced173b416-serving-cert\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.614059 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-config\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.614094 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-oauth-serving-cert\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.614504 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b475816f-0aee-4aed-92a4-82ced173b416-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.615224 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-oauth-serving-cert\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.615341 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-service-ca\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.615371 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-config\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.616880 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b475816f-0aee-4aed-92a4-82ced173b416-serving-cert\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.617468 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-trusted-ca-bundle\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.618023 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-oauth-config\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.618080 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.618934 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-console-serving-cert\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.620095 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.621013 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.626642 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.627779 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.633755 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.634903 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.637788 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.640301 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.641120 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.645172 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mvkhw"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.646630 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.649765 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.651323 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.653570 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hlltg"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.655702 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.657520 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cllbv"] Nov 26 15:09:36 crc 
kubenswrapper[4785]: I1126 15:09:36.659567 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-452cx"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.660610 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-452cx" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.661654 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.662670 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-s7n5j"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.663917 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.664213 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-h5xj5"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.664640 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.665409 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-452cx"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.666778 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h5xj5"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.667628 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-s7n5j"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.668640 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cnl46"] Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.669281 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.681200 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.701445 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.721346 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.741054 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.762233 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.781488 4785 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.801448 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.822179 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.861710 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.881442 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.901332 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.920491 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.941875 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.961675 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 26 15:09:36 crc kubenswrapper[4785]: I1126 15:09:36.981441 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 26 
15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.001708 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.021534 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.041945 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.062087 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.081270 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.101460 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.121035 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.142359 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.161046 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.180646 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 
15:09:37.201985 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.222745 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.241415 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.261542 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.281599 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.301033 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.320826 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.341296 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.360929 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.421151 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.421644 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5kn4\" (UniqueName: \"kubernetes.io/projected/21c0b576-b3af-4d08-8e09-2c3728c8623e-kube-api-access-n5kn4\") pod \"controller-manager-879f6c89f-dss42\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.440796 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.460474 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.481387 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.501626 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.519170 4785 request.go:700] Waited for 1.002466352s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.521497 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.541206 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.560631 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 26 15:09:37 crc 
kubenswrapper[4785]: I1126 15:09:37.583035 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.601348 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.602227 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.622508 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.642504 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.669997 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.681658 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.701593 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.723227 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.742085 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.761531 4785 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.782045 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.793603 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dss42"]
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.800626 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.820539 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.840901 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.860987 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.881267 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.901957 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.921042 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.941066 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.960462 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 26 15:09:37 crc kubenswrapper[4785]: I1126 15:09:37.980866 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.001536 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.021422 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.041595 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.061225 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.082331 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.101816 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.130271 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.140697 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.160877 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.180875 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.217210 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h75g\" (UniqueName: \"kubernetes.io/projected/b475816f-0aee-4aed-92a4-82ced173b416-kube-api-access-2h75g\") pod \"openshift-config-operator-7777fb866f-z8vql\" (UID: \"b475816f-0aee-4aed-92a4-82ced173b416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.237405 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzr2\" (UniqueName: \"kubernetes.io/projected/44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7-kube-api-access-7wzr2\") pod \"console-f9d7485db-6cx55\" (UID: \"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7\") " pod="openshift-console/console-f9d7485db-6cx55"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.241052 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.260983 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.281688 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.302740 4785 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.321308 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.341394 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.361180 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.380877 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.390479 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.400906 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.421593 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.438416 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6cx55"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.440320 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.461158 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.481540 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.523714 4785 request.go:700] Waited for 1.116154099s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-vz9fw
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537098 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxlcs\" (UniqueName: \"kubernetes.io/projected/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-kube-api-access-dxlcs\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537137 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-audit\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537166 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gm8\" (UniqueName: \"kubernetes.io/projected/50a056ca-c016-46ad-bb2c-0bf1d27511c3-kube-api-access-f7gm8\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537192 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-encryption-config\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537214 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-auth-proxy-config\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537236 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537256 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537277 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-policies\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537300 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537324 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53e6c54c-b18d-459d-b2b4-208e82921018-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537372 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllmj\" (UniqueName: \"kubernetes.io/projected/fb4884c6-873b-4728-8633-4ce0b794dfcd-kube-api-access-dllmj\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537395 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50a056ca-c016-46ad-bb2c-0bf1d27511c3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537418 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537440 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-etcd-client\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537463 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537494 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537517 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-metrics-certs\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537562 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-bound-sa-token\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537592 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb4884c6-873b-4728-8633-4ce0b794dfcd-service-ca-bundle\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537611 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c85461-2fda-45e4-a240-5984b368b216-audit-dir\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537628 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537646 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537664 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537684 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537706 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537726 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-service-ca-bundle\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537749 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e6c54c-b18d-459d-b2b4-208e82921018-config\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537770 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537792 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cd4a8fe-0b02-4ee4-b125-c23de9e56acf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4k7ds\" (UID: \"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537813 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537834 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq97x\" (UniqueName: \"kubernetes.io/projected/b9d58930-37e7-40ed-ab98-48654e752018-kube-api-access-lq97x\") pod \"dns-operator-744455d44c-bzwjs\" (UID: \"b9d58930-37e7-40ed-ab98-48654e752018\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537857 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc5c31e5-88ab-41d0-9976-b63f97b85543-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537878 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-serving-cert\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537908 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d527b78-bffe-47b6-bbaa-f06bcaab2316-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537927 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-config\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537946 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6x4\" (UniqueName: \"kubernetes.io/projected/e167755f-d572-4f94-a24b-99a6b0c15552-kube-api-access-5s6x4\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.537992 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-tls\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538012 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-default-certificate\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538035 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538076 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538097 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5npj\" (UniqueName: \"kubernetes.io/projected/cb9844e2-63a7-4437-ab9d-5047b2363580-kube-api-access-j5npj\") pod \"downloads-7954f5f757-vn49n\" (UID: \"cb9844e2-63a7-4437-ab9d-5047b2363580\") " pod="openshift-console/downloads-7954f5f757-vn49n"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538116 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50a056ca-c016-46ad-bb2c-0bf1d27511c3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538136 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-config\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538155 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538178 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-dir\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538198 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-audit-dir\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538219 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-serving-cert\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538268 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e167755f-d572-4f94-a24b-99a6b0c15552-serving-cert\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538288 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc5c31e5-88ab-41d0-9976-b63f97b85543-config\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538307 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzhp\" (UniqueName: \"kubernetes.io/projected/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-kube-api-access-xmzhp\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538327 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln7mr\" (UniqueName: \"kubernetes.io/projected/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-kube-api-access-ln7mr\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538348 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538368 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-trusted-ca\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538388 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmlk\" (UniqueName: \"kubernetes.io/projected/8cd4a8fe-0b02-4ee4-b125-c23de9e56acf-kube-api-access-tnmlk\") pod \"cluster-samples-operator-665b6dd947-4k7ds\" (UID: \"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538409 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538427 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d58930-37e7-40ed-ab98-48654e752018-metrics-tls\") pod \"dns-operator-744455d44c-bzwjs\" (UID: \"b9d58930-37e7-40ed-ab98-48654e752018\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538447 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-etcd-serving-ca\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538469 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njvq8\" (UniqueName: \"kubernetes.io/projected/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-kube-api-access-njvq8\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538488 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deef326b-464a-4a51-89e8-7fae04b5669a-config\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538509 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-config\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538527 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-trusted-ca\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538573 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d527b78-bffe-47b6-bbaa-f06bcaab2316-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538597 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-audit-policies\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538631 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6vh\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-kube-api-access-kp6vh\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538651 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbb8r\" (UniqueName: \"kubernetes.io/projected/1d527b78-bffe-47b6-bbaa-f06bcaab2316-kube-api-access-fbb8r\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538673 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-config\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw"
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.538694 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8bf\" (UniqueName: \"kubernetes.io/projected/03c85461-2fda-45e4-a240-5984b368b216-kube-api-access-cp8bf\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7"
Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.543605 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.043539152 +0000 UTC m=+142.721904916 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.543853 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544035 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544168 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j94xc\" (UniqueName: \"kubernetes.io/projected/dc5c31e5-88ab-41d0-9976-b63f97b85543-kube-api-access-j94xc\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544240 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-stats-auth\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544432 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dc5c31e5-88ab-41d0-9976-b63f97b85543-images\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544508 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/03c85461-2fda-45e4-a240-5984b368b216-node-pullsecrets\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544542 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-etcd-client\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544698 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/deef326b-464a-4a51-89e8-7fae04b5669a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 
15:09:38.544734 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-machine-approver-tls\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544760 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e6c54c-b18d-459d-b2b4-208e82921018-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544926 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-serving-cert\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544964 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.544996 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcdzs\" (UniqueName: 
\"kubernetes.io/projected/5d3c5506-f58a-45bb-adbe-e895b7e4d646-kube-api-access-xcdzs\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.545018 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-image-import-ca\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.545047 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/deef326b-464a-4a51-89e8-7fae04b5669a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.545226 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-encryption-config\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.545259 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-certificates\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" 
Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.564224 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z8vql"] Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.640023 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6cx55"] Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646172 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.646331 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.146307397 +0000 UTC m=+142.824673171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646413 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-plugins-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646446 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29647\" (UniqueName: \"kubernetes.io/projected/c75fc6a1-3a11-401c-a519-285f2d2fed86-kube-api-access-29647\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646481 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646505 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-stats-auth\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646531 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j94xc\" (UniqueName: \"kubernetes.io/projected/dc5c31e5-88ab-41d0-9976-b63f97b85543-kube-api-access-j94xc\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646572 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6428b894-a54a-4acc-8f84-66843f7165f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646595 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2m9\" (UniqueName: \"kubernetes.io/projected/917f8462-10e3-4f5d-b991-7e549c9fbc0c-kube-api-access-sr2m9\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646618 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-machine-approver-tls\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 
26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646663 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-etcd-client\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646747 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/deef326b-464a-4a51-89e8-7fae04b5669a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.646987 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16404b4a-e9c8-4e18-bd40-0bdcab054a44-webhook-cert\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647020 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e6c54c-b18d-459d-b2b4-208e82921018-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647043 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-serving-cert\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: 
\"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647064 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-proxy-tls\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647089 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfrll\" (UniqueName: \"kubernetes.io/projected/1a208865-a883-4a5e-9a56-2fb270dd3ffc-kube-api-access-jfrll\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647110 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8ggt\" (UniqueName: \"kubernetes.io/projected/99f14898-6d00-4964-8eaa-f58db7e92512-kube-api-access-k8ggt\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647133 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-certificates\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647153 
4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-config\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647178 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxlcs\" (UniqueName: \"kubernetes.io/projected/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-kube-api-access-dxlcs\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647199 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-images\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647227 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gm8\" (UniqueName: \"kubernetes.io/projected/50a056ca-c016-46ad-bb2c-0bf1d27511c3-kube-api-access-f7gm8\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647251 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-encryption-config\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647273 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-auth-proxy-config\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647302 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647316 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647328 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 
15:09:38.647351 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647412 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/bebeadc0-d563-42fc-9283-819249f42c0f-kube-api-access-lkccq\") pod \"control-plane-machine-set-operator-78cbb6b69f-h952m\" (UID: \"bebeadc0-d563-42fc-9283-819249f42c0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647445 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53e6c54c-b18d-459d-b2b4-208e82921018-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647464 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50a056ca-c016-46ad-bb2c-0bf1d27511c3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647483 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-ca\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647499 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-socket-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647526 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16404b4a-e9c8-4e18-bd40-0bdcab054a44-apiservice-cert\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647544 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647588 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72tq\" (UniqueName: \"kubernetes.io/projected/6428b894-a54a-4acc-8f84-66843f7165f0-kube-api-access-f72tq\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 
15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647602 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzr5\" (UniqueName: \"kubernetes.io/projected/c9abc8ae-a944-4f87-909b-8258f95c2c06-kube-api-access-wzzr5\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647623 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647638 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bae6a4ce-857f-47f4-8eda-688a559a21d2-cert\") pod \"ingress-canary-h5xj5\" (UID: \"bae6a4ce-857f-47f4-8eda-688a559a21d2\") " pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647652 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-client-ca\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647674 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed213209-32b5-4239-a332-afb26f56e83c-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647689 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647705 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647732 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb4884c6-873b-4728-8633-4ce0b794dfcd-service-ca-bundle\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647747 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647764 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a208865-a883-4a5e-9a56-2fb270dd3ffc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647780 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647795 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647811 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e6c54c-b18d-459d-b2b4-208e82921018-config\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647826 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cd4a8fe-0b02-4ee4-b125-c23de9e56acf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4k7ds\" (UID: 
\"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647842 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647867 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a190f82-8e34-4e31-b2a1-e773974ccf4e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hlltg\" (UID: \"5a190f82-8e34-4e31-b2a1-e773974ccf4e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647884 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq97x\" (UniqueName: \"kubernetes.io/projected/b9d58930-37e7-40ed-ab98-48654e752018-kube-api-access-lq97x\") pod \"dns-operator-744455d44c-bzwjs\" (UID: \"b9d58930-37e7-40ed-ab98-48654e752018\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647899 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc5c31e5-88ab-41d0-9976-b63f97b85543-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647915 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-serving-cert\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647931 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnk98\" (UniqueName: \"kubernetes.io/projected/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-kube-api-access-qnk98\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647946 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-service-ca\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647973 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-tls\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.647989 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648006 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c75fc6a1-3a11-401c-a519-285f2d2fed86-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648024 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50a056ca-c016-46ad-bb2c-0bf1d27511c3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648041 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-config\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648058 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5npj\" (UniqueName: \"kubernetes.io/projected/cb9844e2-63a7-4437-ab9d-5047b2363580-kube-api-access-j5npj\") pod \"downloads-7954f5f757-vn49n\" (UID: \"cb9844e2-63a7-4437-ab9d-5047b2363580\") " pod="openshift-console/downloads-7954f5f757-vn49n" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648075 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648090 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-dir\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648090 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-auth-proxy-config\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648107 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b006dace-4a31-4ab1-af0f-144eeea6994e-srv-cert\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648133 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed213209-32b5-4239-a332-afb26f56e83c-metrics-tls\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648539 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648713 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648797 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-certificates\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648917 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb4884c6-873b-4728-8633-4ce0b794dfcd-service-ca-bundle\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.648921 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc 
kubenswrapper[4785]: E1126 15:09:38.648923 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.148903144 +0000 UTC m=+142.827268928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.649786 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e6c54c-b18d-459d-b2b4-208e82921018-config\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.650404 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.650409 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.650915 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651005 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e167755f-d572-4f94-a24b-99a6b0c15552-serving-cert\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651043 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-dir\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651091 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e6c54c-b18d-459d-b2b4-208e82921018-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651108 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-config\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651273 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-audit-dir\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651274 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651309 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651402 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-audit-dir\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651450 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-serving-cert\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651487 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-trusted-ca\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651617 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmlk\" (UniqueName: \"kubernetes.io/projected/8cd4a8fe-0b02-4ee4-b125-c23de9e56acf-kube-api-access-tnmlk\") pod \"cluster-samples-operator-665b6dd947-4k7ds\" (UID: \"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651665 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b9d58930-37e7-40ed-ab98-48654e752018-metrics-tls\") pod \"dns-operator-744455d44c-bzwjs\" (UID: \"b9d58930-37e7-40ed-ab98-48654e752018\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651688 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-etcd-serving-ca\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651752 4785 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50a056ca-c016-46ad-bb2c-0bf1d27511c3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651809 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-trusted-ca\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.651989 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-serving-cert\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.652319 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-etcd-serving-ca\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.652927 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-trusted-ca\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.653804 
4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-serving-cert\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.653933 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-encryption-config\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.653940 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.654091 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-machine-approver-tls\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.654229 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50a056ca-c016-46ad-bb2c-0bf1d27511c3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.654431 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-tls\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.654548 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-etcd-client\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.654860 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-stats-auth\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655504 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-config\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: W1126 15:09:38.655652 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ec5c36_3bfd_42c0_a4a7_28f7e0e5aad7.slice/crio-55216d7ffb121692660ebb06254131fb77b3e52cc872bb3aff57f986de9142e2 
WatchSource:0}: Error finding container 55216d7ffb121692660ebb06254131fb77b3e52cc872bb3aff57f986de9142e2: Status 404 returned error can't find the container with id 55216d7ffb121692660ebb06254131fb77b3e52cc872bb3aff57f986de9142e2 Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655663 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-audit-policies\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655587 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e167755f-d572-4f94-a24b-99a6b0c15552-serving-cert\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655710 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp8bf\" (UniqueName: \"kubernetes.io/projected/03c85461-2fda-45e4-a240-5984b368b216-kube-api-access-cp8bf\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655778 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c75fc6a1-3a11-401c-a519-285f2d2fed86-proxy-tls\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655817 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655845 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wldj\" (UniqueName: \"kubernetes.io/projected/b006dace-4a31-4ab1-af0f-144eeea6994e-kube-api-access-2wldj\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655903 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dc5c31e5-88ab-41d0-9976-b63f97b85543-images\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655954 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15b03acc-55fb-4845-8658-a7035db26deb-node-bootstrap-token\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.655984 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-secret-volume\") pod \"collect-profiles-29402820-vcn7w\" (UID: 
\"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656015 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/03c85461-2fda-45e4-a240-5984b368b216-node-pullsecrets\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656042 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bdb530b8-a573-45a5-beae-3d60ee06ede2-signing-cabundle\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656067 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fx8\" (UniqueName: \"kubernetes.io/projected/bae6a4ce-857f-47f4-8eda-688a559a21d2-kube-api-access-74fx8\") pod \"ingress-canary-h5xj5\" (UID: \"bae6a4ce-857f-47f4-8eda-688a559a21d2\") " pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656089 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxjh\" (UniqueName: \"kubernetes.io/projected/bc1be764-4067-4f28-83e1-684ab5cdebaa-kube-api-access-cbxjh\") pod \"migrator-59844c95c7-fsdwl\" (UID: \"bc1be764-4067-4f28-83e1-684ab5cdebaa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656129 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/dc5c31e5-88ab-41d0-9976-b63f97b85543-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656138 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20dcdb37-da72-4fd8-8927-23cd9799d180-config-volume\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656176 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-registration-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656286 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/03c85461-2fda-45e4-a240-5984b368b216-node-pullsecrets\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656320 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656334 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-csi-data-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656399 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656426 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcdzs\" (UniqueName: \"kubernetes.io/projected/5d3c5506-f58a-45bb-adbe-e895b7e4d646-kube-api-access-xcdzs\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656619 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-image-import-ca\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656863 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dc5c31e5-88ab-41d0-9976-b63f97b85543-images\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc 
kubenswrapper[4785]: I1126 15:09:38.656898 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/deef326b-464a-4a51-89e8-7fae04b5669a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656934 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a208865-a883-4a5e-9a56-2fb270dd3ffc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656962 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-encryption-config\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.656993 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-mountpoint-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657019 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-audit\") pod 
\"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657072 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b006dace-4a31-4ab1-af0f-144eeea6994e-profile-collector-cert\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657095 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c750c46-8c0b-4853-80ec-dac544558a67-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657120 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657143 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c750c46-8c0b-4853-80ec-dac544558a67-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc 
kubenswrapper[4785]: I1126 15:09:38.657174 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-policies\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657197 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16404b4a-e9c8-4e18-bd40-0bdcab054a44-tmpfs\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657228 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dllmj\" (UniqueName: \"kubernetes.io/projected/fb4884c6-873b-4728-8633-4ce0b794dfcd-kube-api-access-dllmj\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657252 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-config\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657281 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42x7h\" (UniqueName: \"kubernetes.io/projected/20dcdb37-da72-4fd8-8927-23cd9799d180-kube-api-access-42x7h\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " 
pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657305 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/99f14898-6d00-4964-8eaa-f58db7e92512-profile-collector-cert\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657330 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/15b03acc-55fb-4845-8658-a7035db26deb-certs\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657350 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-image-import-ca\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657361 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657391 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zxq\" (UniqueName: 
\"kubernetes.io/projected/16404b4a-e9c8-4e18-bd40-0bdcab054a44-kube-api-access-g7zxq\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657414 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-client\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657679 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-etcd-client\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.657801 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-audit\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658017 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-metrics-certs\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658067 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-policies\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658093 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsxk\" (UniqueName: \"kubernetes.io/projected/5a190f82-8e34-4e31-b2a1-e773974ccf4e-kube-api-access-jjsxk\") pod \"multus-admission-controller-857f4d67dd-hlltg\" (UID: \"5a190f82-8e34-4e31-b2a1-e773974ccf4e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658246 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-bound-sa-token\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658294 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658369 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhfqt\" (UniqueName: \"kubernetes.io/projected/936506c8-413e-4978-830e-323d53a45cdf-kube-api-access-dhfqt\") pod \"package-server-manager-789f6589d5-qr82q\" (UID: \"936506c8-413e-4978-830e-323d53a45cdf\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658421 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c85461-2fda-45e4-a240-5984b368b216-audit-dir\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658455 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjj4s\" (UniqueName: \"kubernetes.io/projected/bdb530b8-a573-45a5-beae-3d60ee06ede2-kube-api-access-jjj4s\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658480 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917f8462-10e3-4f5d-b991-7e549c9fbc0c-config\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658577 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-service-ca-bundle\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658605 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4cl7\" (UniqueName: 
\"kubernetes.io/projected/ed213209-32b5-4239-a332-afb26f56e83c-kube-api-access-p4cl7\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658627 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c750c46-8c0b-4853-80ec-dac544558a67-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658653 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-kube-api-access-967vg\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658706 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7zwb\" (UniqueName: \"kubernetes.io/projected/15b03acc-55fb-4845-8658-a7035db26deb-kube-api-access-d7zwb\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658744 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658777 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6x4\" (UniqueName: \"kubernetes.io/projected/e167755f-d572-4f94-a24b-99a6b0c15552-kube-api-access-5s6x4\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658801 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgg4l\" (UniqueName: \"kubernetes.io/projected/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-kube-api-access-dgg4l\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658829 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d527b78-bffe-47b6-bbaa-f06bcaab2316-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.658983 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-config\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659013 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-default-certificate\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659045 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659070 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/99f14898-6d00-4964-8eaa-f58db7e92512-srv-cert\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659092 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwrd\" (UniqueName: \"kubernetes.io/projected/b022f00c-c760-4fc7-85be-bf6bda07ed6c-kube-api-access-bjwrd\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659109 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc5c31e5-88ab-41d0-9976-b63f97b85543-config\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659128 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzhp\" (UniqueName: \"kubernetes.io/projected/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-kube-api-access-xmzhp\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659193 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln7mr\" (UniqueName: \"kubernetes.io/projected/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-kube-api-access-ln7mr\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659218 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/917f8462-10e3-4f5d-b991-7e549c9fbc0c-serving-cert\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659239 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659257 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: 
\"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659275 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deef326b-464a-4a51-89e8-7fae04b5669a-config\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659296 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njvq8\" (UniqueName: \"kubernetes.io/projected/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-kube-api-access-njvq8\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659314 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed213209-32b5-4239-a332-afb26f56e83c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659331 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bdb530b8-a573-45a5-beae-3d60ee06ede2-signing-key\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659346 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/936506c8-413e-4978-830e-323d53a45cdf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qr82q\" (UID: \"936506c8-413e-4978-830e-323d53a45cdf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659365 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebeadc0-d563-42fc-9283-819249f42c0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h952m\" (UID: \"bebeadc0-d563-42fc-9283-819249f42c0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659381 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-config-volume\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659403 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cd4a8fe-0b02-4ee4-b125-c23de9e56acf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4k7ds\" (UID: \"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659411 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d527b78-bffe-47b6-bbaa-f06bcaab2316-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.659495 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-config\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.660018 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c85461-2fda-45e4-a240-5984b368b216-audit-dir\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.660073 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deef326b-464a-4a51-89e8-7fae04b5669a-config\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.660310 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.660472 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d527b78-bffe-47b6-bbaa-f06bcaab2316-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.660665 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc5c31e5-88ab-41d0-9976-b63f97b85543-config\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.660930 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-audit-policies\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661074 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20dcdb37-da72-4fd8-8927-23cd9799d180-metrics-tls\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661120 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03c85461-2fda-45e4-a240-5984b368b216-config\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661370 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbb8r\" 
(UniqueName: \"kubernetes.io/projected/1d527b78-bffe-47b6-bbaa-f06bcaab2316-kube-api-access-fbb8r\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661482 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-config\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661422 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661388 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e167755f-d572-4f94-a24b-99a6b0c15552-service-ca-bundle\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.661797 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp6vh\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-kube-api-access-kp6vh\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: 
I1126 15:09:38.662424 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-config\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.663386 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-serving-cert\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.667393 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/deef326b-464a-4a51-89e8-7fae04b5669a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.668001 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-trusted-ca\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.670649 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.670909 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-metrics-certs\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.671125 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-encryption-config\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.671849 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.671854 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.672095 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-etcd-client\") pod 
\"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.672209 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c85461-2fda-45e4-a240-5984b368b216-serving-cert\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.672426 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb4884c6-873b-4728-8633-4ce0b794dfcd-default-certificate\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.673066 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d527b78-bffe-47b6-bbaa-f06bcaab2316-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.673073 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.674898 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b9d58930-37e7-40ed-ab98-48654e752018-metrics-tls\") pod \"dns-operator-744455d44c-bzwjs\" (UID: \"b9d58930-37e7-40ed-ab98-48654e752018\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.676141 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.676646 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.681110 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.696386 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j94xc\" (UniqueName: \"kubernetes.io/projected/dc5c31e5-88ab-41d0-9976-b63f97b85543-kube-api-access-j94xc\") pod \"machine-api-operator-5694c8668f-lcfs6\" (UID: \"dc5c31e5-88ab-41d0-9976-b63f97b85543\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 
15:09:38.720412 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxlcs\" (UniqueName: \"kubernetes.io/projected/2fb7b6f6-18b7-4fcd-b339-061bf56eb47a-kube-api-access-dxlcs\") pod \"machine-approver-56656f9798-w4twq\" (UID: \"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.735080 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gm8\" (UniqueName: \"kubernetes.io/projected/50a056ca-c016-46ad-bb2c-0bf1d27511c3-kube-api-access-f7gm8\") pod \"openshift-controller-manager-operator-756b6f6bc6-vz9fw\" (UID: \"50a056ca-c016-46ad-bb2c-0bf1d27511c3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.757597 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53e6c54c-b18d-459d-b2b4-208e82921018-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6tqsq\" (UID: \"53e6c54c-b18d-459d-b2b4-208e82921018\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762614 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.762730 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 15:09:39.262710205 +0000 UTC m=+142.941075969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762785 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wldj\" (UniqueName: \"kubernetes.io/projected/b006dace-4a31-4ab1-af0f-144eeea6994e-kube-api-access-2wldj\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762808 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15b03acc-55fb-4845-8658-a7035db26deb-node-bootstrap-token\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762827 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bdb530b8-a573-45a5-beae-3d60ee06ede2-signing-cabundle\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762843 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74fx8\" 
(UniqueName: \"kubernetes.io/projected/bae6a4ce-857f-47f4-8eda-688a559a21d2-kube-api-access-74fx8\") pod \"ingress-canary-h5xj5\" (UID: \"bae6a4ce-857f-47f4-8eda-688a559a21d2\") " pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762860 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-secret-volume\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762883 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxjh\" (UniqueName: \"kubernetes.io/projected/bc1be764-4067-4f28-83e1-684ab5cdebaa-kube-api-access-cbxjh\") pod \"migrator-59844c95c7-fsdwl\" (UID: \"bc1be764-4067-4f28-83e1-684ab5cdebaa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762902 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20dcdb37-da72-4fd8-8927-23cd9799d180-config-volume\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762919 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-registration-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762934 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-csi-data-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762962 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a208865-a883-4a5e-9a56-2fb270dd3ffc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762977 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-mountpoint-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.762995 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b006dace-4a31-4ab1-af0f-144eeea6994e-profile-collector-cert\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763010 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c750c46-8c0b-4853-80ec-dac544558a67-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc 
kubenswrapper[4785]: I1126 15:09:38.763031 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c750c46-8c0b-4853-80ec-dac544558a67-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763047 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16404b4a-e9c8-4e18-bd40-0bdcab054a44-tmpfs\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763067 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-config\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763083 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42x7h\" (UniqueName: \"kubernetes.io/projected/20dcdb37-da72-4fd8-8927-23cd9799d180-kube-api-access-42x7h\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763098 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/99f14898-6d00-4964-8eaa-f58db7e92512-profile-collector-cert\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763113 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/15b03acc-55fb-4845-8658-a7035db26deb-certs\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763131 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zxq\" (UniqueName: \"kubernetes.io/projected/16404b4a-e9c8-4e18-bd40-0bdcab054a44-kube-api-access-g7zxq\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763147 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-client\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763164 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsxk\" (UniqueName: \"kubernetes.io/projected/5a190f82-8e34-4e31-b2a1-e773974ccf4e-kube-api-access-jjsxk\") pod \"multus-admission-controller-857f4d67dd-hlltg\" (UID: \"5a190f82-8e34-4e31-b2a1-e773974ccf4e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763185 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763200 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhfqt\" (UniqueName: \"kubernetes.io/projected/936506c8-413e-4978-830e-323d53a45cdf-kube-api-access-dhfqt\") pod \"package-server-manager-789f6589d5-qr82q\" (UID: \"936506c8-413e-4978-830e-323d53a45cdf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763224 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjj4s\" (UniqueName: \"kubernetes.io/projected/bdb530b8-a573-45a5-beae-3d60ee06ede2-kube-api-access-jjj4s\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763240 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917f8462-10e3-4f5d-b991-7e549c9fbc0c-config\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763257 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4cl7\" (UniqueName: \"kubernetes.io/projected/ed213209-32b5-4239-a332-afb26f56e83c-kube-api-access-p4cl7\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 
15:09:38.763277 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c750c46-8c0b-4853-80ec-dac544558a67-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763293 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-kube-api-access-967vg\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763310 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7zwb\" (UniqueName: \"kubernetes.io/projected/15b03acc-55fb-4845-8658-a7035db26deb-kube-api-access-d7zwb\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763337 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgg4l\" (UniqueName: \"kubernetes.io/projected/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-kube-api-access-dgg4l\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763352 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/99f14898-6d00-4964-8eaa-f58db7e92512-srv-cert\") pod \"olm-operator-6b444d44fb-56grk\" (UID: 
\"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763368 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwrd\" (UniqueName: \"kubernetes.io/projected/b022f00c-c760-4fc7-85be-bf6bda07ed6c-kube-api-access-bjwrd\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763391 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/917f8462-10e3-4f5d-b991-7e549c9fbc0c-serving-cert\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763412 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed213209-32b5-4239-a332-afb26f56e83c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763427 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bdb530b8-a573-45a5-beae-3d60ee06ede2-signing-key\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763443 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/936506c8-413e-4978-830e-323d53a45cdf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qr82q\" (UID: \"936506c8-413e-4978-830e-323d53a45cdf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763461 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebeadc0-d563-42fc-9283-819249f42c0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h952m\" (UID: \"bebeadc0-d563-42fc-9283-819249f42c0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763476 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20dcdb37-da72-4fd8-8927-23cd9799d180-metrics-tls\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763490 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-config-volume\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763515 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-plugins-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 
15:09:38.763531 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29647\" (UniqueName: \"kubernetes.io/projected/c75fc6a1-3a11-401c-a519-285f2d2fed86-kube-api-access-29647\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763547 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6428b894-a54a-4acc-8f84-66843f7165f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763582 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16404b4a-e9c8-4e18-bd40-0bdcab054a44-webhook-cert\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763597 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2m9\" (UniqueName: \"kubernetes.io/projected/917f8462-10e3-4f5d-b991-7e549c9fbc0c-kube-api-access-sr2m9\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763614 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-proxy-tls\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: 
\"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763628 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfrll\" (UniqueName: \"kubernetes.io/projected/1a208865-a883-4a5e-9a56-2fb270dd3ffc-kube-api-access-jfrll\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763643 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8ggt\" (UniqueName: \"kubernetes.io/projected/99f14898-6d00-4964-8eaa-f58db7e92512-kube-api-access-k8ggt\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763658 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-config\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763673 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-images\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763691 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/bebeadc0-d563-42fc-9283-819249f42c0f-kube-api-access-lkccq\") pod \"control-plane-machine-set-operator-78cbb6b69f-h952m\" (UID: \"bebeadc0-d563-42fc-9283-819249f42c0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763707 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16404b4a-e9c8-4e18-bd40-0bdcab054a44-apiservice-cert\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763722 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-ca\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763736 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-socket-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763751 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 
15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763769 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763783 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bae6a4ce-857f-47f4-8eda-688a559a21d2-cert\") pod \"ingress-canary-h5xj5\" (UID: \"bae6a4ce-857f-47f4-8eda-688a559a21d2\") " pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763799 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-client-ca\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763807 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bdb530b8-a573-45a5-beae-3d60ee06ede2-signing-cabundle\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763814 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72tq\" (UniqueName: \"kubernetes.io/projected/6428b894-a54a-4acc-8f84-66843f7165f0-kube-api-access-f72tq\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: 
\"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763874 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzr5\" (UniqueName: \"kubernetes.io/projected/c9abc8ae-a944-4f87-909b-8258f95c2c06-kube-api-access-wzzr5\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763901 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed213209-32b5-4239-a332-afb26f56e83c-trusted-ca\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763927 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a208865-a883-4a5e-9a56-2fb270dd3ffc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763957 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a190f82-8e34-4e31-b2a1-e773974ccf4e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hlltg\" (UID: \"5a190f82-8e34-4e31-b2a1-e773974ccf4e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763976 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qnk98\" (UniqueName: \"kubernetes.io/projected/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-kube-api-access-qnk98\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.763991 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-service-ca\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764015 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c75fc6a1-3a11-401c-a519-285f2d2fed86-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764041 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b006dace-4a31-4ab1-af0f-144eeea6994e-srv-cert\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764056 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed213209-32b5-4239-a332-afb26f56e83c-metrics-tls\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 
26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764071 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764090 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-config\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764115 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-serving-cert\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764174 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c75fc6a1-3a11-401c-a519-285f2d2fed86-proxy-tls\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.764948 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a208865-a883-4a5e-9a56-2fb270dd3ffc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.765835 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/15b03acc-55fb-4845-8658-a7035db26deb-node-bootstrap-token\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.766421 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/99f14898-6d00-4964-8eaa-f58db7e92512-profile-collector-cert\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.766771 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed213209-32b5-4239-a332-afb26f56e83c-trusted-ca\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.767192 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-serving-cert\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.767684 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.767944 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b006dace-4a31-4ab1-af0f-144eeea6994e-srv-cert\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.768006 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c75fc6a1-3a11-401c-a519-285f2d2fed86-proxy-tls\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.768079 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-mountpoint-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.768359 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-secret-volume\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.768660 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3c750c46-8c0b-4853-80ec-dac544558a67-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.768686 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-service-ca\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.769304 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-config-volume\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.769460 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-registration-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.769611 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-plugins-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.770345 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/15b03acc-55fb-4845-8658-a7035db26deb-certs\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.770658 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-csi-data-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.770718 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20dcdb37-da72-4fd8-8927-23cd9799d180-config-volume\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.770776 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c75fc6a1-3a11-401c-a519-285f2d2fed86-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.770891 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16404b4a-e9c8-4e18-bd40-0bdcab054a44-tmpfs\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.771379 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b006dace-4a31-4ab1-af0f-144eeea6994e-profile-collector-cert\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.772090 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-images\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.772117 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed213209-32b5-4239-a332-afb26f56e83c-metrics-tls\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.772190 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" event={"ID":"21c0b576-b3af-4d08-8e09-2c3728c8623e","Type":"ContainerStarted","Data":"922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6"} Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.772233 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" event={"ID":"21c0b576-b3af-4d08-8e09-2c3728c8623e","Type":"ContainerStarted","Data":"99a90ce8e7350bb22ead93c90ef66f2288bc38cce129ed78d6d1195c736bb5bd"} Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.773636 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 
15:09:38.773936 4785 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dss42 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.773989 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" podUID="21c0b576-b3af-4d08-8e09-2c3728c8623e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.774034 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b022f00c-c760-4fc7-85be-bf6bda07ed6c-socket-dir\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.774452 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a208865-a883-4a5e-9a56-2fb270dd3ffc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.775221 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20dcdb37-da72-4fd8-8927-23cd9799d180-metrics-tls\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.775364 4785 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" event={"ID":"b475816f-0aee-4aed-92a4-82ced173b416","Type":"ContainerStarted","Data":"ba79af6d395043b2ac8058c9ead673f5ec1354cee63993ab5a91436d14c951e5"} Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.775734 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" event={"ID":"b475816f-0aee-4aed-92a4-82ced173b416","Type":"ContainerStarted","Data":"99277a7394d5d4adb955407df0e6bc3b1639970dab415a09ef09fa5d75286125"} Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.775918 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.776019 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16404b4a-e9c8-4e18-bd40-0bdcab054a44-apiservice-cert\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.776100 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/917f8462-10e3-4f5d-b991-7e549c9fbc0c-config\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.776221 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.276204695 +0000 UTC m=+142.954570469 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.776596 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-proxy-tls\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.776901 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-config\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.776953 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bdb530b8-a573-45a5-beae-3d60ee06ede2-signing-key\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.777149 4785 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6cx55" event={"ID":"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7","Type":"ContainerStarted","Data":"55216d7ffb121692660ebb06254131fb77b3e52cc872bb3aff57f986de9142e2"} Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.777328 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/99f14898-6d00-4964-8eaa-f58db7e92512-srv-cert\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.777879 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.778500 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/936506c8-413e-4978-830e-323d53a45cdf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qr82q\" (UID: \"936506c8-413e-4978-830e-323d53a45cdf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.778668 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-client-ca\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.779187 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-ca\") pod 
\"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.779303 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5npj\" (UniqueName: \"kubernetes.io/projected/cb9844e2-63a7-4437-ab9d-5047b2363580-kube-api-access-j5npj\") pod \"downloads-7954f5f757-vn49n\" (UID: \"cb9844e2-63a7-4437-ab9d-5047b2363580\") " pod="openshift-console/downloads-7954f5f757-vn49n" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.779788 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/917f8462-10e3-4f5d-b991-7e549c9fbc0c-serving-cert\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.779906 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5a190f82-8e34-4e31-b2a1-e773974ccf4e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hlltg\" (UID: \"5a190f82-8e34-4e31-b2a1-e773974ccf4e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.780027 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bae6a4ce-857f-47f4-8eda-688a559a21d2-cert\") pod \"ingress-canary-h5xj5\" (UID: \"bae6a4ce-857f-47f4-8eda-688a559a21d2\") " pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.780504 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16404b4a-e9c8-4e18-bd40-0bdcab054a44-webhook-cert\") pod 
\"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.781294 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c750c46-8c0b-4853-80ec-dac544558a67-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.781655 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6428b894-a54a-4acc-8f84-66843f7165f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.782149 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-etcd-client\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.782936 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.783170 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebeadc0-d563-42fc-9283-819249f42c0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h952m\" (UID: \"bebeadc0-d563-42fc-9283-819249f42c0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.807003 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq97x\" (UniqueName: \"kubernetes.io/projected/b9d58930-37e7-40ed-ab98-48654e752018-kube-api-access-lq97x\") pod \"dns-operator-744455d44c-bzwjs\" (UID: \"b9d58930-37e7-40ed-ab98-48654e752018\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.815730 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmlk\" (UniqueName: \"kubernetes.io/projected/8cd4a8fe-0b02-4ee4-b125-c23de9e56acf-kube-api-access-tnmlk\") pod \"cluster-samples-operator-665b6dd947-4k7ds\" (UID: \"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.838642 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp8bf\" (UniqueName: \"kubernetes.io/projected/03c85461-2fda-45e4-a240-5984b368b216-kube-api-access-cp8bf\") pod \"apiserver-76f77b778f-s6nf7\" (UID: \"03c85461-2fda-45e4-a240-5984b368b216\") " pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.860970 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.861624 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcdzs\" (UniqueName: \"kubernetes.io/projected/5d3c5506-f58a-45bb-adbe-e895b7e4d646-kube-api-access-xcdzs\") pod \"oauth-openshift-558db77b4-jmg2s\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.865361 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.865484 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.365457399 +0000 UTC m=+143.043823163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.865886 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.866677 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.36665924 +0000 UTC m=+143.045025104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.882471 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/deef326b-464a-4a51-89e8-7fae04b5669a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b2j9t\" (UID: \"deef326b-464a-4a51-89e8-7fae04b5669a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.899152 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.915959 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllmj\" (UniqueName: \"kubernetes.io/projected/fb4884c6-873b-4728-8633-4ce0b794dfcd-kube-api-access-dllmj\") pod \"router-default-5444994796-4v4sr\" (UID: \"fb4884c6-873b-4728-8633-4ce0b794dfcd\") " pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.947304 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njvq8\" (UniqueName: 
\"kubernetes.io/projected/5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0-kube-api-access-njvq8\") pod \"apiserver-7bbb656c7d-4h5d5\" (UID: \"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.947831 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.965842 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq"] Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.966777 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.967271 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzhp\" (UniqueName: \"kubernetes.io/projected/f3f5b38b-0a65-4340-b4a2-18e4da7574f4-kube-api-access-xmzhp\") pod \"console-operator-58897d9998-pnqcw\" (UID: \"f3f5b38b-0a65-4340-b4a2-18e4da7574f4\") " pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.967478 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:38 crc kubenswrapper[4785]: E1126 15:09:38.967543 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-26 15:09:39.467523586 +0000 UTC m=+143.145889350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.984528 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.987398 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-bound-sa-token\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:38 crc kubenswrapper[4785]: I1126 15:09:38.995742 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.000126 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6x4\" (UniqueName: \"kubernetes.io/projected/e167755f-d572-4f94-a24b-99a6b0c15552-kube-api-access-5s6x4\") pod \"authentication-operator-69f744f599-qdc2p\" (UID: \"e167755f-d572-4f94-a24b-99a6b0c15552\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.006428 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.016505 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln7mr\" (UniqueName: \"kubernetes.io/projected/45ee2e82-4124-490e-acb9-d7dc4b5c7cc4-kube-api-access-ln7mr\") pod \"cluster-image-registry-operator-dc59b4c8b-s7fwr\" (UID: \"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.016831 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.039988 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbb8r\" (UniqueName: \"kubernetes.io/projected/1d527b78-bffe-47b6-bbaa-f06bcaab2316-kube-api-access-fbb8r\") pod \"openshift-apiserver-operator-796bbdcf4f-wb7dh\" (UID: \"1d527b78-bffe-47b6-bbaa-f06bcaab2316\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.047041 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.055493 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-vn49n" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.059435 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lcfs6"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.062766 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp6vh\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-kube-api-access-kp6vh\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.065563 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.068587 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.069441 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.569428588 +0000 UTC m=+143.247794352 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.086444 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.092834 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.109362 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fx8\" (UniqueName: \"kubernetes.io/projected/bae6a4ce-857f-47f4-8eda-688a559a21d2-kube-api-access-74fx8\") pod \"ingress-canary-h5xj5\" (UID: \"bae6a4ce-857f-47f4-8eda-688a559a21d2\") " pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.117709 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.119304 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wldj\" (UniqueName: \"kubernetes.io/projected/b006dace-4a31-4ab1-af0f-144eeea6994e-kube-api-access-2wldj\") pod \"catalog-operator-68c6474976-wmz66\" (UID: \"b006dace-4a31-4ab1-af0f-144eeea6994e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.140648 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72tq\" (UniqueName: \"kubernetes.io/projected/6428b894-a54a-4acc-8f84-66843f7165f0-kube-api-access-f72tq\") pod \"route-controller-manager-6576b87f9c-sq88m\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.145936 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.163569 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42x7h\" (UniqueName: \"kubernetes.io/projected/20dcdb37-da72-4fd8-8927-23cd9799d180-kube-api-access-42x7h\") pod \"dns-default-452cx\" (UID: \"20dcdb37-da72-4fd8-8927-23cd9799d180\") " pod="openshift-dns/dns-default-452cx" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.169980 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.170120 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.670099308 +0000 UTC m=+143.348465072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.170668 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.171050 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.671033942 +0000 UTC m=+143.349399706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.178890 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnk98\" (UniqueName: \"kubernetes.io/projected/0a45fa88-9d4e-4801-8d54-a7fc5ca37240-kube-api-access-qnk98\") pod \"machine-config-operator-74547568cd-mwxx7\" (UID: \"0a45fa88-9d4e-4801-8d54-a7fc5ca37240\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.192278 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.196195 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzr5\" (UniqueName: \"kubernetes.io/projected/c9abc8ae-a944-4f87-909b-8258f95c2c06-kube-api-access-wzzr5\") pod \"marketplace-operator-79b997595-cllbv\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.216455 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxjh\" (UniqueName: \"kubernetes.io/projected/bc1be764-4067-4f28-83e1-684ab5cdebaa-kube-api-access-cbxjh\") pod \"migrator-59844c95c7-fsdwl\" (UID: \"bc1be764-4067-4f28-83e1-684ab5cdebaa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.216843 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.219892 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.221092 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.236624 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/d07f0d40-7f03-41c2-ac98-5e5a7eb72a95-kube-api-access-967vg\") pod \"etcd-operator-b45778765-vt4z2\" (UID: \"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95\") " pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.241723 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-452cx" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.254047 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pnqcw"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.257717 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zxq\" (UniqueName: \"kubernetes.io/projected/16404b4a-e9c8-4e18-bd40-0bdcab054a44-kube-api-access-g7zxq\") pod \"packageserver-d55dfcdfc-4z7ct\" (UID: \"16404b4a-e9c8-4e18-bd40-0bdcab054a44\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.258358 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.271911 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h5xj5" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.272378 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.272537 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.772506723 +0000 UTC m=+143.450872497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.272975 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.273344 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.773326625 +0000 UTC m=+143.451692389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.274712 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/bebeadc0-d563-42fc-9283-819249f42c0f-kube-api-access-lkccq\") pod \"control-plane-machine-set-operator-78cbb6b69f-h952m\" (UID: \"bebeadc0-d563-42fc-9283-819249f42c0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:39 crc kubenswrapper[4785]: W1126 15:09:39.294857 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5c31e5_88ab_41d0_9976_b63f97b85543.slice/crio-a58b71600de9f243894a506b47f281fe6589d1cab3ef508274cc79b08edd87c2 WatchSource:0}: Error finding container a58b71600de9f243894a506b47f281fe6589d1cab3ef508274cc79b08edd87c2: Status 404 returned error can't find the container with id a58b71600de9f243894a506b47f281fe6589d1cab3ef508274cc79b08edd87c2 Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.295389 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgg4l\" (UniqueName: \"kubernetes.io/projected/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-kube-api-access-dgg4l\") pod \"collect-profiles-29402820-vcn7w\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:39 crc kubenswrapper[4785]: W1126 15:09:39.297445 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c5bd1c8_cf40_4ad2_8556_6269ab2a08d0.slice/crio-921ffd51b4e2294d3befe3d233ce7b1841aeb1c67ee0bafb0aa64d97519a1525 WatchSource:0}: Error finding container 921ffd51b4e2294d3befe3d233ce7b1841aeb1c67ee0bafb0aa64d97519a1525: Status 404 returned error can't find the container with id 921ffd51b4e2294d3befe3d233ce7b1841aeb1c67ee0bafb0aa64d97519a1525 Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.315995 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29647\" (UniqueName: \"kubernetes.io/projected/c75fc6a1-3a11-401c-a519-285f2d2fed86-kube-api-access-29647\") pod \"machine-config-controller-84d6567774-98w8d\" (UID: \"c75fc6a1-3a11-401c-a519-285f2d2fed86\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.334245 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed213209-32b5-4239-a332-afb26f56e83c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.375334 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.375831 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.875811542 +0000 UTC m=+143.554177316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.390084 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsxk\" (UniqueName: \"kubernetes.io/projected/5a190f82-8e34-4e31-b2a1-e773974ccf4e-kube-api-access-jjsxk\") pod \"multus-admission-controller-857f4d67dd-hlltg\" (UID: \"5a190f82-8e34-4e31-b2a1-e773974ccf4e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.402312 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.409770 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c750c46-8c0b-4853-80ec-dac544558a67-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9wv9b\" (UID: \"3c750c46-8c0b-4853-80ec-dac544558a67\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.418091 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.418094 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.421804 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.428812 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.430390 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2m9\" (UniqueName: \"kubernetes.io/projected/917f8462-10e3-4f5d-b991-7e549c9fbc0c-kube-api-access-sr2m9\") pod \"service-ca-operator-777779d784-7dxnf\" (UID: \"917f8462-10e3-4f5d-b991-7e549c9fbc0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.439088 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjj4s\" (UniqueName: \"kubernetes.io/projected/bdb530b8-a573-45a5-beae-3d60ee06ede2-kube-api-access-jjj4s\") pod \"service-ca-9c57cc56f-mvkhw\" (UID: \"bdb530b8-a573-45a5-beae-3d60ee06ede2\") " pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.444443 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfrll\" (UniqueName: \"kubernetes.io/projected/1a208865-a883-4a5e-9a56-2fb270dd3ffc-kube-api-access-jfrll\") pod \"kube-storage-version-migrator-operator-b67b599dd-4q4dl\" (UID: \"1a208865-a883-4a5e-9a56-2fb270dd3ffc\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.461766 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8ggt\" (UniqueName: \"kubernetes.io/projected/99f14898-6d00-4964-8eaa-f58db7e92512-kube-api-access-k8ggt\") pod \"olm-operator-6b444d44fb-56grk\" (UID: \"99f14898-6d00-4964-8eaa-f58db7e92512\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.471661 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.476849 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.477204 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:39.977187341 +0000 UTC m=+143.655553105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.480332 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.488125 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.493770 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.500862 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.503534 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.505352 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7zwb\" (UniqueName: \"kubernetes.io/projected/15b03acc-55fb-4845-8658-a7035db26deb-kube-api-access-d7zwb\") pod \"machine-config-server-cnl46\" (UID: \"15b03acc-55fb-4845-8658-a7035db26deb\") " pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.505984 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhfqt\" (UniqueName: \"kubernetes.io/projected/936506c8-413e-4978-830e-323d53a45cdf-kube-api-access-dhfqt\") pod \"package-server-manager-789f6589d5-qr82q\" (UID: \"936506c8-413e-4978-830e-323d53a45cdf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.508895 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jmg2s"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.511124 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.526078 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4cl7\" (UniqueName: \"kubernetes.io/projected/ed213209-32b5-4239-a332-afb26f56e83c-kube-api-access-p4cl7\") pod \"ingress-operator-5b745b69d9-dr58q\" (UID: \"ed213209-32b5-4239-a332-afb26f56e83c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.529922 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.545236 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwrd\" (UniqueName: \"kubernetes.io/projected/b022f00c-c760-4fc7-85be-bf6bda07ed6c-kube-api-access-bjwrd\") pod \"csi-hostpathplugin-s7n5j\" (UID: \"b022f00c-c760-4fc7-85be-bf6bda07ed6c\") " pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.568381 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.576082 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cnl46" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.577619 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.578213 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.078119328 +0000 UTC m=+143.756485092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: W1126 15:09:39.594613 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3c5506_f58a_45bb_adbe_e895b7e4d646.slice/crio-81c8a5c7e5f27c279372c67eb67bd1782d9657a8b6e2e8cb209424884773bbae WatchSource:0}: Error finding container 81c8a5c7e5f27c279372c67eb67bd1782d9657a8b6e2e8cb209424884773bbae: Status 404 returned error can't find the container with id 81c8a5c7e5f27c279372c67eb67bd1782d9657a8b6e2e8cb209424884773bbae Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.679473 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.679907 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.179892946 +0000 UTC m=+143.858258710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.699499 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.728070 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vn49n"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.733964 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.754337 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.762809 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.780656 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.781216 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.281200123 +0000 UTC m=+143.959565887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.816825 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6cx55" event={"ID":"44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7","Type":"ContainerStarted","Data":"120fb33f4b194231ea834edbccea803eca5addf97245ff6c74d9a567991ca51c"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.828657 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-vt4z2"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.829173 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" event={"ID":"50a056ca-c016-46ad-bb2c-0bf1d27511c3","Type":"ContainerStarted","Data":"0cad44332fe0c6caeacdea86600020d24843548337eecb0296ab90f3053c6cc6"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.838016 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" event={"ID":"f3f5b38b-0a65-4340-b4a2-18e4da7574f4","Type":"ContainerStarted","Data":"87a72253ffe55d64664d18dab76b774467bdeba26c9990b5bd948b86e00433ac"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.850611 4785 generic.go:334] "Generic (PLEG): container finished" podID="b475816f-0aee-4aed-92a4-82ced173b416" containerID="ba79af6d395043b2ac8058c9ead673f5ec1354cee63993ab5a91436d14c951e5" exitCode=0 Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.850684 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" event={"ID":"b475816f-0aee-4aed-92a4-82ced173b416","Type":"ContainerDied","Data":"ba79af6d395043b2ac8058c9ead673f5ec1354cee63993ab5a91436d14c951e5"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.880169 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" event={"ID":"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a","Type":"ContainerStarted","Data":"1c299873aa5f07c9e92132276b1b01ef240dec976dbf7bbd0047a7998432594e"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.883405 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:39 
crc kubenswrapper[4785]: E1126 15:09:39.884656 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.384641885 +0000 UTC m=+144.063007649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.885053 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" event={"ID":"53e6c54c-b18d-459d-b2b4-208e82921018","Type":"ContainerStarted","Data":"d861b2de69003eeb50e7883a3479594af8b0f2b992b3c61572c1ea64933c76fb"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.897263 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" event={"ID":"5d3c5506-f58a-45bb-adbe-e895b7e4d646","Type":"ContainerStarted","Data":"81c8a5c7e5f27c279372c67eb67bd1782d9657a8b6e2e8cb209424884773bbae"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.900516 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" event={"ID":"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0","Type":"ContainerStarted","Data":"921ffd51b4e2294d3befe3d233ce7b1841aeb1c67ee0bafb0aa64d97519a1525"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.912045 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" 
event={"ID":"dc5c31e5-88ab-41d0-9976-b63f97b85543","Type":"ContainerStarted","Data":"a58b71600de9f243894a506b47f281fe6589d1cab3ef508274cc79b08edd87c2"} Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.924788 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzwjs"] Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.958799 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" podStartSLOduration=121.958778868 podStartE2EDuration="2m1.958778868s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:39.958580312 +0000 UTC m=+143.636946096" watchObservedRunningTime="2025-11-26 15:09:39.958778868 +0000 UTC m=+143.637144652" Nov 26 15:09:39 crc kubenswrapper[4785]: I1126 15:09:39.984084 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:39 crc kubenswrapper[4785]: E1126 15:09:39.985361 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.485345046 +0000 UTC m=+144.163710810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.013930 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-452cx"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.021954 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h5xj5"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.074029 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.085738 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.086102 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.586084698 +0000 UTC m=+144.264450522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.186988 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.187321 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.687308363 +0000 UTC m=+144.365674127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.275634 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.277895 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.295941 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.296980 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.297439 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.797414308 +0000 UTC m=+144.475780122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: W1126 15:09:40.346793 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9844e2_63a7_4437_ab9d_5047b2363580.slice/crio-947933b5794ad7867092ce29de8313ad5aed566d7a5613795b9eecf7ec4efc6b WatchSource:0}: Error finding container 947933b5794ad7867092ce29de8313ad5aed566d7a5613795b9eecf7ec4efc6b: Status 404 returned error can't find the container with id 947933b5794ad7867092ce29de8313ad5aed566d7a5613795b9eecf7ec4efc6b Nov 26 15:09:40 crc kubenswrapper[4785]: W1126 15:09:40.350054 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9d58930_37e7_40ed_ab98_48654e752018.slice/crio-9d2d8bd64aa55c6154adf46460b0338fee3c073a535d7da6bdc790b5a7a88e24 WatchSource:0}: Error finding container 9d2d8bd64aa55c6154adf46460b0338fee3c073a535d7da6bdc790b5a7a88e24: Status 404 returned error can't find the container with id 9d2d8bd64aa55c6154adf46460b0338fee3c073a535d7da6bdc790b5a7a88e24 Nov 26 15:09:40 crc kubenswrapper[4785]: W1126 15:09:40.358767 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae6a4ce_857f_47f4_8eda_688a559a21d2.slice/crio-690bc872344e74164f3aa88fe2089a926637700a1a44efb4d4eceafa4437a425 WatchSource:0}: Error finding container 690bc872344e74164f3aa88fe2089a926637700a1a44efb4d4eceafa4437a425: Status 404 returned error can't find the container 
with id 690bc872344e74164f3aa88fe2089a926637700a1a44efb4d4eceafa4437a425 Nov 26 15:09:40 crc kubenswrapper[4785]: W1126 15:09:40.365173 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d527b78_bffe_47b6_bbaa_f06bcaab2316.slice/crio-3c4bdb9a069d5678e23a3266af6e7b32b7f3ba622ffb4687e3f565638ab72676 WatchSource:0}: Error finding container 3c4bdb9a069d5678e23a3266af6e7b32b7f3ba622ffb4687e3f565638ab72676: Status 404 returned error can't find the container with id 3c4bdb9a069d5678e23a3266af6e7b32b7f3ba622ffb4687e3f565638ab72676 Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.405095 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.405825 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:40.905793628 +0000 UTC m=+144.584159392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.506562 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.506941 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.00692634 +0000 UTC m=+144.685292104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.610256 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.610724 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.110699311 +0000 UTC m=+144.789065075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.721889 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.722349 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.222337066 +0000 UTC m=+144.900702830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.825887 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.826232 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.326217069 +0000 UTC m=+145.004582833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.829589 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qdc2p"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.835412 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.885313 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.917426 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-s7n5j"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.928610 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:40 crc kubenswrapper[4785]: E1126 15:09:40.929000 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 15:09:41.428985434 +0000 UTC m=+145.107351198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.930904 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s6nf7"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.934884 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" event={"ID":"53e6c54c-b18d-459d-b2b4-208e82921018","Type":"ContainerStarted","Data":"51d9241d5fc680f7229b65d9f9b4b9e774708059a4edea1d75c89a5d959ce6a8"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.938615 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4v4sr" event={"ID":"fb4884c6-873b-4728-8633-4ce0b794dfcd","Type":"ContainerStarted","Data":"a2fafc558eafa3cdf10d30911466994f8563befc7c66fabefb5c5aa91d926193"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.945945 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.951958 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-452cx" event={"ID":"20dcdb37-da72-4fd8-8927-23cd9799d180","Type":"ContainerStarted","Data":"861422d7ee0227920baf638c2d83e6955a7734d4f19bd73bda8646fa2720a64c"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.953287 4785 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" event={"ID":"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4","Type":"ContainerStarted","Data":"0c3ea171e523be3d45109ac6d9fe579a794cd0f2e8fc6760833595d5a587db36"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.954570 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h5xj5" event={"ID":"bae6a4ce-857f-47f4-8eda-688a559a21d2","Type":"ContainerStarted","Data":"690bc872344e74164f3aa88fe2089a926637700a1a44efb4d4eceafa4437a425"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.964769 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" event={"ID":"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a","Type":"ContainerStarted","Data":"a00f256a1f5622d12bdc7bc2c996dfe1e16f30328277af7ad166c87ea511f4fe"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.966064 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" event={"ID":"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95","Type":"ContainerStarted","Data":"4d7aa3220acf6ba7dbeda66d35fd215b8e560f68361eb1379fecdd265fbd0f8d"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.971271 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" event={"ID":"1d527b78-bffe-47b6-bbaa-f06bcaab2316","Type":"ContainerStarted","Data":"3c4bdb9a069d5678e23a3266af6e7b32b7f3ba622ffb4687e3f565638ab72676"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.975049 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" event={"ID":"b9d58930-37e7-40ed-ab98-48654e752018","Type":"ContainerStarted","Data":"9d2d8bd64aa55c6154adf46460b0338fee3c073a535d7da6bdc790b5a7a88e24"} Nov 26 15:09:40 crc kubenswrapper[4785]: 
I1126 15:09:40.984204 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" event={"ID":"b006dace-4a31-4ab1-af0f-144eeea6994e","Type":"ContainerStarted","Data":"bf6bb0f03b76a9e9c4710fc958eaf17cfb38eb2e64bd6af1267d6c9ad6803c6b"} Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.986981 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct"] Nov 26 15:09:40 crc kubenswrapper[4785]: I1126 15:09:40.999600 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vn49n" event={"ID":"cb9844e2-63a7-4437-ab9d-5047b2363580","Type":"ContainerStarted","Data":"947933b5794ad7867092ce29de8313ad5aed566d7a5613795b9eecf7ec4efc6b"} Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.013973 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cllbv"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.029572 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.029936 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.529921041 +0000 UTC m=+145.208286795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.085959 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.132447 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.134240 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.634228395 +0000 UTC m=+145.312594159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.176267 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6cx55" podStartSLOduration=123.176251925 podStartE2EDuration="2m3.176251925s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:41.174588472 +0000 UTC m=+144.852954256" watchObservedRunningTime="2025-11-26 15:09:41.176251925 +0000 UTC m=+144.854617689" Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.234074 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.234480 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.734467404 +0000 UTC m=+145.412833158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.335576 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.335906 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.835894824 +0000 UTC m=+145.514260588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.426503 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.436674 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.437124 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:41.937107228 +0000 UTC m=+145.615472992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: W1126 15:09:41.485345 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6428b894_a54a_4acc_8f84_66843f7165f0.slice/crio-ee9e5c86c3bf579e7074af649739a8a3d5bf490e5fb2b8ef6149e9e9e08f2d51 WatchSource:0}: Error finding container ee9e5c86c3bf579e7074af649739a8a3d5bf490e5fb2b8ef6149e9e9e08f2d51: Status 404 returned error can't find the container with id ee9e5c86c3bf579e7074af649739a8a3d5bf490e5fb2b8ef6149e9e9e08f2d51 Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.540129 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.540476 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.040465008 +0000 UTC m=+145.718830772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.550294 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.593832 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.642535 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.642946 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.142928125 +0000 UTC m=+145.821293889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.654678 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.688091 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mvkhw"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.712484 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.745820 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.746267 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.246251954 +0000 UTC m=+145.924617708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.834523 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.847158 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.847447 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.347364516 +0000 UTC m=+146.025730280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.847509 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.848869 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.348859005 +0000 UTC m=+146.027224769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:41 crc kubenswrapper[4785]: W1126 15:09:41.863364 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75fc6a1_3a11_401c_a519_285f2d2fed86.slice/crio-0a014f9f3e01c86df6383fe955901661d1562fd6f71e08da04049382d0888915 WatchSource:0}: Error finding container 0a014f9f3e01c86df6383fe955901661d1562fd6f71e08da04049382d0888915: Status 404 returned error can't find the container with id 0a014f9f3e01c86df6383fe955901661d1562fd6f71e08da04049382d0888915 Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.899455 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hlltg"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.905788 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.910665 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.927973 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q"] Nov 26 15:09:41 crc kubenswrapper[4785]: I1126 15:09:41.950925 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:41 crc kubenswrapper[4785]: E1126 15:09:41.951208 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.451193427 +0000 UTC m=+146.129559181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.010669 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" event={"ID":"c75fc6a1-3a11-401c-a519-285f2d2fed86","Type":"ContainerStarted","Data":"0a014f9f3e01c86df6383fe955901661d1562fd6f71e08da04049382d0888915"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.011533 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" event={"ID":"16404b4a-e9c8-4e18-bd40-0bdcab054a44","Type":"ContainerStarted","Data":"ec1288e5eb6d5e5227a898459a4be88f1c55b5affabce4b385287212ad6180b5"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.012544 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" 
event={"ID":"0a45fa88-9d4e-4801-8d54-a7fc5ca37240","Type":"ContainerStarted","Data":"00d630bf522555130156ffb25abf67ef0e64a638be8bb3a7726cc9c76db0f74a"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.013113 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" event={"ID":"b022f00c-c760-4fc7-85be-bf6bda07ed6c","Type":"ContainerStarted","Data":"daef11cebe2c8753dba4d7e1c007a708b35aa9edaeb00813484831bb192a5922"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.015104 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" event={"ID":"f3f5b38b-0a65-4340-b4a2-18e4da7574f4","Type":"ContainerStarted","Data":"cc83f790b57a17e46aff2c5dd31c26563f396754ae6019bc7d1a938e4b48f201"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.015451 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.017208 4785 generic.go:334] "Generic (PLEG): container finished" podID="5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0" containerID="5192eba86f480d6d8e83797ca53a1bfd2d00cff20b7cd6e6a439d6a29d1864fc" exitCode=0 Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.017276 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" event={"ID":"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0","Type":"ContainerDied","Data":"5192eba86f480d6d8e83797ca53a1bfd2d00cff20b7cd6e6a439d6a29d1864fc"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.017415 4785 patch_prober.go:28] interesting pod/console-operator-58897d9998-pnqcw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 
15:09:42.018098 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" podUID="f3f5b38b-0a65-4340-b4a2-18e4da7574f4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.018378 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" event={"ID":"c9abc8ae-a944-4f87-909b-8258f95c2c06","Type":"ContainerStarted","Data":"a36dcba9ed41cef720133ca4e4654b2cc3a3664cafeb1c6c82703ec22ef8cc11"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.020148 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" event={"ID":"917f8462-10e3-4f5d-b991-7e549c9fbc0c","Type":"ContainerStarted","Data":"ede1e0e6399de02a6d1a1f8617160f05bd98319f08cf14667d1c4cae14857078"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.020754 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" event={"ID":"e167755f-d572-4f94-a24b-99a6b0c15552","Type":"ContainerStarted","Data":"ac9cd1525aba8f1fca9939f449c98d820e8cd8c713e3b227bf18057d59c7b6ad"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.021233 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" event={"ID":"bc1be764-4067-4f28-83e1-684ab5cdebaa","Type":"ContainerStarted","Data":"a10858d5b399f95c2493e3d3270e22e5d510e6fbb0b7b158ca22cf05aec5e0ad"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.021766 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" 
event={"ID":"3c750c46-8c0b-4853-80ec-dac544558a67","Type":"ContainerStarted","Data":"f3e2524ab78addf7c8c0a10c9631937bae0d58e46e819288bde40197f9ca665b"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.022259 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" event={"ID":"6428b894-a54a-4acc-8f84-66843f7165f0","Type":"ContainerStarted","Data":"ee9e5c86c3bf579e7074af649739a8a3d5bf490e5fb2b8ef6149e9e9e08f2d51"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.023702 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" event={"ID":"d36477d3-bcc2-47e0-8aab-687e7ae01f9e","Type":"ContainerStarted","Data":"5b4f9b2012bb6182e942ffcb0b9b438fe88c639cf056d08c661d9f893e08df82"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.024397 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" event={"ID":"bebeadc0-d563-42fc-9283-819249f42c0f","Type":"ContainerStarted","Data":"035151f5ff024d558542f1156ec8c7c5ed9acb08b9898131b94dd5454aeba3df"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.027366 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" event={"ID":"5a190f82-8e34-4e31-b2a1-e773974ccf4e","Type":"ContainerStarted","Data":"d277e799e78325f477eeafdf833d3470891bbc06563617536a00e44fc7dc5f53"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.029064 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" event={"ID":"deef326b-464a-4a51-89e8-7fae04b5669a","Type":"ContainerStarted","Data":"7e45596cc37ecea30795d773699b1f6eecc19f5e494047bcd551e4eaa7dbda80"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.028637 4785 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" podStartSLOduration=124.028624185 podStartE2EDuration="2m4.028624185s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:42.028119931 +0000 UTC m=+145.706485695" watchObservedRunningTime="2025-11-26 15:09:42.028624185 +0000 UTC m=+145.706989949" Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.033177 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" event={"ID":"03c85461-2fda-45e4-a240-5984b368b216","Type":"ContainerStarted","Data":"fe38da89f768acb63c380e9446d141fcd64a4d0c1b865843948d1d776d51a377"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.033939 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" event={"ID":"1a208865-a883-4a5e-9a56-2fb270dd3ffc","Type":"ContainerStarted","Data":"da78a0a2fb704c1a62c40d82a488d8a14c5a4ca5521c12b8d53f020ad2c9f7e1"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.035000 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" event={"ID":"dc5c31e5-88ab-41d0-9976-b63f97b85543","Type":"ContainerStarted","Data":"b6b96429174dd4dc1757ce71866fa57169855d8357e70c7270982584e1f241a6"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.035667 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" event={"ID":"99f14898-6d00-4964-8eaa-f58db7e92512","Type":"ContainerStarted","Data":"44d120eda15a0d79d01b962e91189e5ad361a271d971a43526452c48fafeb02e"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.037060 4785 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" event={"ID":"b9d58930-37e7-40ed-ab98-48654e752018","Type":"ContainerStarted","Data":"73a1b6b47863d62aa75a31238e2790a07f234ba802f6967028e47bfb4074b7a5"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.038118 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" event={"ID":"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf","Type":"ContainerStarted","Data":"18e3f0c6f38644e277937f877e6ef1ddcbf85509eebd7dbce34a222c96956a1e"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.038940 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" event={"ID":"bdb530b8-a573-45a5-beae-3d60ee06ede2","Type":"ContainerStarted","Data":"bf58e2cc6597bd4dc8935311044c16ceb653617eaddee26a23beeb259ee7114f"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.040419 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cnl46" event={"ID":"15b03acc-55fb-4845-8658-a7035db26deb","Type":"ContainerStarted","Data":"039630fd5d09cda581e946ecf1a690adaecdd712de02416ef8aec113e294869d"} Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.052732 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.053218 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 15:09:42.553203992 +0000 UTC m=+146.231569746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.063789 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6tqsq" podStartSLOduration=124.063765796 podStartE2EDuration="2m4.063765796s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:42.058956851 +0000 UTC m=+145.737322615" watchObservedRunningTime="2025-11-26 15:09:42.063765796 +0000 UTC m=+145.742131570" Nov 26 15:09:42 crc kubenswrapper[4785]: W1126 15:09:42.140420 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded213209_32b5_4239_a332_afb26f56e83c.slice/crio-d64aac4083b2e97210750ab6c85c3d6a5a6cd5c79b98e6f5d65a5d24108b2794 WatchSource:0}: Error finding container d64aac4083b2e97210750ab6c85c3d6a5a6cd5c79b98e6f5d65a5d24108b2794: Status 404 returned error can't find the container with id d64aac4083b2e97210750ab6c85c3d6a5a6cd5c79b98e6f5d65a5d24108b2794 Nov 26 15:09:42 crc kubenswrapper[4785]: W1126 15:09:42.142176 4785 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod936506c8_413e_4978_830e_323d53a45cdf.slice/crio-62d719eae5b1b6a0005932ba98aaec687d11af37a468f4735caf39c989d647d5 WatchSource:0}: Error finding container 62d719eae5b1b6a0005932ba98aaec687d11af37a468f4735caf39c989d647d5: Status 404 returned error can't find the container with id 62d719eae5b1b6a0005932ba98aaec687d11af37a468f4735caf39c989d647d5 Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.156594 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.156730 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.656709966 +0000 UTC m=+146.335075730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.157305 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.159858 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.659835307 +0000 UTC m=+146.338201161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.260492 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.260684 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.760667751 +0000 UTC m=+146.439033515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.261284 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.261652 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.761638486 +0000 UTC m=+146.440004250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.361997 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.362142 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.862116922 +0000 UTC m=+146.540482696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.362298 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.362602 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.862591324 +0000 UTC m=+146.540957088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.463708 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.463865 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.963839859 +0000 UTC m=+146.642205623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.464492 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.465018 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:42.965007879 +0000 UTC m=+146.643373643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.565434 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.565621 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.065593597 +0000 UTC m=+146.743959361 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.565723 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.566158 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.06609615 +0000 UTC m=+146.744461994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.666762 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.667420 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.167399257 +0000 UTC m=+146.845765031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.768223 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.768734 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.268723584 +0000 UTC m=+146.947089348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.869094 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.869531 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.369510667 +0000 UTC m=+147.047876441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:42 crc kubenswrapper[4785]: I1126 15:09:42.972932 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:42 crc kubenswrapper[4785]: E1126 15:09:42.976180 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.476156323 +0000 UTC m=+147.154522087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.072819 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" event={"ID":"50a056ca-c016-46ad-bb2c-0bf1d27511c3","Type":"ContainerStarted","Data":"3f4aaa55e3317e3cd308a8a14d221fffcbe3074e923c7967c2977b98a7d9a89d"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.074128 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.074525 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.574512533 +0000 UTC m=+147.252878297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.082675 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" event={"ID":"936506c8-413e-4978-830e-323d53a45cdf","Type":"ContainerStarted","Data":"da92ab8001cfcecb110f75a9a3e9de8a8b5512b4b9051aa47beaf59405cc1415"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.082718 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" event={"ID":"936506c8-413e-4978-830e-323d53a45cdf","Type":"ContainerStarted","Data":"62d719eae5b1b6a0005932ba98aaec687d11af37a468f4735caf39c989d647d5"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.106832 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vz9fw" podStartSLOduration=125.1068097 podStartE2EDuration="2m5.1068097s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.098125695 +0000 UTC m=+146.776491469" watchObservedRunningTime="2025-11-26 15:09:43.1068097 +0000 UTC m=+146.785175494" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.115198 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" 
event={"ID":"bc1be764-4067-4f28-83e1-684ab5cdebaa","Type":"ContainerStarted","Data":"ab77123b0b860b92d2158138c8182f4e3848b89ffa352da1190be36fa65f4d35"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.156925 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" event={"ID":"dc5c31e5-88ab-41d0-9976-b63f97b85543","Type":"ContainerStarted","Data":"cf48486ba307d2d6acc422adead677e79d569241e2fe767276a7a37acc276f1d"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.165706 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" event={"ID":"b006dace-4a31-4ab1-af0f-144eeea6994e","Type":"ContainerStarted","Data":"d5d893267b17588106bf0f813ca6b4af569888912c91eba9784044f26ab1879e"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.168287 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.175518 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4v4sr" event={"ID":"fb4884c6-873b-4728-8633-4ce0b794dfcd","Type":"ContainerStarted","Data":"8c44f16954042325e789c7540ec65d1e12cf86002447780589daf9db4d2442c6"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.176607 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.178267 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.678255243 +0000 UTC m=+147.356621007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.181683 4785 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wmz66 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.181716 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" podUID="b006dace-4a31-4ab1-af0f-144eeea6994e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.183682 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lcfs6" podStartSLOduration=125.183664903 podStartE2EDuration="2m5.183664903s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.182805161 +0000 UTC m=+146.861170925" 
watchObservedRunningTime="2025-11-26 15:09:43.183664903 +0000 UTC m=+146.862030667" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.185826 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" event={"ID":"ed213209-32b5-4239-a332-afb26f56e83c","Type":"ContainerStarted","Data":"284eaae3700982d93290b9288f48ead3dab5e58f705856e3346f172c34c89fc3"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.185865 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" event={"ID":"ed213209-32b5-4239-a332-afb26f56e83c","Type":"ContainerStarted","Data":"d64aac4083b2e97210750ab6c85c3d6a5a6cd5c79b98e6f5d65a5d24108b2794"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.201996 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" event={"ID":"d36477d3-bcc2-47e0-8aab-687e7ae01f9e","Type":"ContainerStarted","Data":"0aeed79d147bc76f39e87b05fe0e521053bd74345d229524269acb2c64fdc1e2"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.221517 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" podStartSLOduration=125.221503304 podStartE2EDuration="2m5.221503304s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.22057423 +0000 UTC m=+146.898940004" watchObservedRunningTime="2025-11-26 15:09:43.221503304 +0000 UTC m=+146.899869068" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.227915 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" 
event={"ID":"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf","Type":"ContainerStarted","Data":"fdf784fa5811f2024c321b9dbc4290d4c5a3de241cf31fa5b2dfaa6a6aba6bb9"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.246989 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-4v4sr" podStartSLOduration=125.246969314 podStartE2EDuration="2m5.246969314s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.245484336 +0000 UTC m=+146.923850110" watchObservedRunningTime="2025-11-26 15:09:43.246969314 +0000 UTC m=+146.925335078" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.248952 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.262745 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" event={"ID":"deef326b-464a-4a51-89e8-7fae04b5669a","Type":"ContainerStarted","Data":"43fef0b52484d5af753f59a102b58a3674d2a6cd3aae56ce2ee3dd3e2e2d62f2"} Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.265982 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" podStartSLOduration=125.265966187 podStartE2EDuration="2m5.265966187s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.26377864 +0000 UTC m=+146.942144404" watchObservedRunningTime="2025-11-26 15:09:43.265966187 +0000 UTC m=+146.944331951" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.271389 4785 
patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-56grk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.271588 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" podUID="99f14898-6d00-4964-8eaa-f58db7e92512" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.285179 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.286436 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.786421127 +0000 UTC m=+147.464786891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.303723 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" podStartSLOduration=125.303710276 podStartE2EDuration="2m5.303710276s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.301920649 +0000 UTC m=+146.980286423" watchObservedRunningTime="2025-11-26 15:09:43.303710276 +0000 UTC m=+146.982076030"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.312293 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vn49n" event={"ID":"cb9844e2-63a7-4437-ab9d-5047b2363580","Type":"ContainerStarted","Data":"620f2ea431f657ec9611dcd6d4ec6156ac01697f310bdf3acdaa604dd9e3d584"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.313070 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vn49n"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.343867 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.343917 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.344646 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b2j9t" podStartSLOduration=125.344629717 podStartE2EDuration="2m5.344629717s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.344507613 +0000 UTC m=+147.022873397" watchObservedRunningTime="2025-11-26 15:09:43.344629717 +0000 UTC m=+147.022995481"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.354584 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" event={"ID":"b475816f-0aee-4aed-92a4-82ced173b416","Type":"ContainerStarted","Data":"30b476f0c1d9b887b8214c29ba4c37a3fff3ae2c2fb4b0bebe9b3d21c6abb285"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.354920 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.373910 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-452cx" event={"ID":"20dcdb37-da72-4fd8-8927-23cd9799d180","Type":"ContainerStarted","Data":"d038305580c592938b4100d38b950189b6724b894e0cc6ddf4f21d386cfab9b8"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.387195 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.389233 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.889217123 +0000 UTC m=+147.567582887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.411898 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" event={"ID":"d07f0d40-7f03-41c2-ac98-5e5a7eb72a95","Type":"ContainerStarted","Data":"23dc674ac736a615e7619f3128540132d5a2117c32b86f2c28ca1d7691ad2e2a"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.429028 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" event={"ID":"1a208865-a883-4a5e-9a56-2fb270dd3ffc","Type":"ContainerStarted","Data":"feb4d6249329b6f8c5b4f8806270a70ba2203d8616b2c90522d358cb7d0fce78"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.461456 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" event={"ID":"1d527b78-bffe-47b6-bbaa-f06bcaab2316","Type":"ContainerStarted","Data":"859c530a47ab319b0a28112301f3c3dde98badfc11b156a9a840c3cc5c39263e"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.477785 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cnl46" event={"ID":"15b03acc-55fb-4845-8658-a7035db26deb","Type":"ContainerStarted","Data":"81b1b9a7be6e8fd66674275e104f563ce34c2fe745850fa57ca1c7041c4af053"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.495766 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.497391 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:43.997370277 +0000 UTC m=+147.675736041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.497426 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" podStartSLOduration=125.497411698 podStartE2EDuration="2m5.497411698s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.398508304 +0000 UTC m=+147.076874088" watchObservedRunningTime="2025-11-26 15:09:43.497411698 +0000 UTC m=+147.175777462"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.499782 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" podStartSLOduration=125.499771619 podStartE2EDuration="2m5.499771619s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.495981501 +0000 UTC m=+147.174347285" watchObservedRunningTime="2025-11-26 15:09:43.499771619 +0000 UTC m=+147.178137383"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.503944 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" event={"ID":"e167755f-d572-4f94-a24b-99a6b0c15552","Type":"ContainerStarted","Data":"8f12950da08aeed0ff5f8ad881dc2380709b0e124e48fa00b1b734e2449e41ef"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.518966 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vn49n" podStartSLOduration=125.518946426 podStartE2EDuration="2m5.518946426s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.51717224 +0000 UTC m=+147.195538014" watchObservedRunningTime="2025-11-26 15:09:43.518946426 +0000 UTC m=+147.197312200"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.523083 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" event={"ID":"bebeadc0-d563-42fc-9283-819249f42c0f","Type":"ContainerStarted","Data":"aaaa36be7b195f43c91fa895c93a6741438029e47cebdf55d9cc55698886d86d"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.533711 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" event={"ID":"6428b894-a54a-4acc-8f84-66843f7165f0","Type":"ContainerStarted","Data":"97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.534398 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.545132 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" event={"ID":"c75fc6a1-3a11-401c-a519-285f2d2fed86","Type":"ContainerStarted","Data":"ee54f08401dc39e8ac0d9554218d2d2966cea083a8dcfb8d1ec5c0cc1a3ab637"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.560380 4785 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-sq88m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.560442 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" podUID="6428b894-a54a-4acc-8f84-66843f7165f0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.562787 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" podStartSLOduration=125.562763132 podStartE2EDuration="2m5.562763132s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.541863881 +0000 UTC m=+147.220229645" watchObservedRunningTime="2025-11-26 15:09:43.562763132 +0000 UTC m=+147.241128886"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.577885 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cnl46" podStartSLOduration=7.577866104 podStartE2EDuration="7.577866104s" podCreationTimestamp="2025-11-26 15:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.577737081 +0000 UTC m=+147.256102855" watchObservedRunningTime="2025-11-26 15:09:43.577866104 +0000 UTC m=+147.256231868"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.580501 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" event={"ID":"16404b4a-e9c8-4e18-bd40-0bdcab054a44","Type":"ContainerStarted","Data":"aa5a6565d8d17cf9669c50eae0989845fec15226a0db58b408eb213ec220970c"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.582226 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.585897 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" event={"ID":"0a45fa88-9d4e-4801-8d54-a7fc5ca37240","Type":"ContainerStarted","Data":"5e2227e6851514b2be75059bd7da291fd0117e80b6a88df56237828e99d2a2a5"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.598257 4785 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4z7ct container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.598319 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" podUID="16404b4a-e9c8-4e18-bd40-0bdcab054a44" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.599211 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.605712 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" event={"ID":"c9abc8ae-a944-4f87-909b-8258f95c2c06","Type":"ContainerStarted","Data":"fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.605772 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv"
Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.612260 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.112233305 +0000 UTC m=+147.790599069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.626509 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h952m" podStartSLOduration=125.626488665 podStartE2EDuration="2m5.626488665s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.619700429 +0000 UTC m=+147.298066203" watchObservedRunningTime="2025-11-26 15:09:43.626488665 +0000 UTC m=+147.304854439"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.638128 4785 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cllbv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body=
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.638168 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.639234 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" event={"ID":"917f8462-10e3-4f5d-b991-7e549c9fbc0c","Type":"ContainerStarted","Data":"ec6fef5e4c9b9961c0fe3d92d933f6c1cc5e639b3c746272fac83ca82ed98477"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.668522 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h5xj5" event={"ID":"bae6a4ce-857f-47f4-8eda-688a559a21d2","Type":"ContainerStarted","Data":"466cd7c8e6f102b641d7a3b3fd4c24398ae01e413c0edbafd5346835be32f880"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.682178 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" event={"ID":"5d3c5506-f58a-45bb-adbe-e895b7e4d646","Type":"ContainerStarted","Data":"5d26e7e5415ccbf92293e4a6ac21162b55c99206553e2d3996f3fadbee0ded50"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.682464 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.696195 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-vt4z2" podStartSLOduration=125.696178312 podStartE2EDuration="2m5.696178312s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.695241527 +0000 UTC m=+147.373607301" watchObservedRunningTime="2025-11-26 15:09:43.696178312 +0000 UTC m=+147.374544076"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.696798 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" podStartSLOduration=125.696791498 podStartE2EDuration="2m5.696791498s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.668052782 +0000 UTC m=+147.346418556" watchObservedRunningTime="2025-11-26 15:09:43.696791498 +0000 UTC m=+147.375157282"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.700203 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" event={"ID":"45ee2e82-4124-490e-acb9-d7dc4b5c7cc4","Type":"ContainerStarted","Data":"7cdfc1af77404d133cb6bfe5fee3ef7c1a114e3d3e3ed4ad1682e1017acbca32"}
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.700317 4785 patch_prober.go:28] interesting pod/console-operator-58897d9998-pnqcw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.700347 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" podUID="f3f5b38b-0a65-4340-b4a2-18e4da7574f4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.700905 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.703424 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.203406169 +0000 UTC m=+147.881771933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.704801 4785 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jmg2s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body=
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.704846 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.757893 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4q4dl" podStartSLOduration=125.757876201 podStartE2EDuration="2m5.757876201s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.730314007 +0000 UTC m=+147.408679771" watchObservedRunningTime="2025-11-26 15:09:43.757876201 +0000 UTC m=+147.436241965"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.759176 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qdc2p" podStartSLOduration=125.759169935 podStartE2EDuration="2m5.759169935s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.75745111 +0000 UTC m=+147.435816874" watchObservedRunningTime="2025-11-26 15:09:43.759169935 +0000 UTC m=+147.437535699"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.791846 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wb7dh" podStartSLOduration=125.791832782 podStartE2EDuration="2m5.791832782s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.790032205 +0000 UTC m=+147.468397979" watchObservedRunningTime="2025-11-26 15:09:43.791832782 +0000 UTC m=+147.470198546"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.803344 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.805938 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.305915867 +0000 UTC m=+147.984281631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.807010 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" podStartSLOduration=125.806995705 podStartE2EDuration="2m5.806995705s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.806331558 +0000 UTC m=+147.484697332" watchObservedRunningTime="2025-11-26 15:09:43.806995705 +0000 UTC m=+147.485361469"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.878219 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" podStartSLOduration=125.878202811 podStartE2EDuration="2m5.878202811s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.877785571 +0000 UTC m=+147.556151345" watchObservedRunningTime="2025-11-26 15:09:43.878202811 +0000 UTC m=+147.556568565"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.879040 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s7fwr" podStartSLOduration=125.879032023 podStartE2EDuration="2m5.879032023s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.847098985 +0000 UTC m=+147.525464769" watchObservedRunningTime="2025-11-26 15:09:43.879032023 +0000 UTC m=+147.557397787"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.904338 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 15:09:43 crc kubenswrapper[4785]: E1126 15:09:43.905052 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.405032917 +0000 UTC m=+148.083398681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.912825 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" podStartSLOduration=125.912809389 podStartE2EDuration="2m5.912809389s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.911634258 +0000 UTC m=+147.590000032" watchObservedRunningTime="2025-11-26 15:09:43.912809389 +0000 UTC m=+147.591175153"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.978141 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" podStartSLOduration=125.978126102 podStartE2EDuration="2m5.978126102s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.945973179 +0000 UTC m=+147.624338953" watchObservedRunningTime="2025-11-26 15:09:43.978126102 +0000 UTC m=+147.656491856"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.979583 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" podStartSLOduration=125.97957663 podStartE2EDuration="2m5.97957663s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.975990067 +0000 UTC m=+147.654355841" watchObservedRunningTime="2025-11-26 15:09:43.97957663 +0000 UTC m=+147.657942394"
Nov 26 15:09:43 crc kubenswrapper[4785]: I1126 15:09:43.996155 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7dxnf" podStartSLOduration=125.996137499 podStartE2EDuration="2m5.996137499s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:43.994150638 +0000 UTC m=+147.672516402" watchObservedRunningTime="2025-11-26 15:09:43.996137499 +0000 UTC m=+147.674503263"
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.006102 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.006394 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.506382365 +0000 UTC m=+148.184748129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.014190 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-h5xj5" podStartSLOduration=8.014169457 podStartE2EDuration="8.014169457s" podCreationTimestamp="2025-11-26 15:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:44.012407131 +0000 UTC m=+147.690772905" watchObservedRunningTime="2025-11-26 15:09:44.014169457 +0000 UTC m=+147.692535221"
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.086685 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-4v4sr"
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.095079 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 26 15:09:44 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld
Nov 26 15:09:44 crc kubenswrapper[4785]: [+]process-running ok
Nov 26 15:09:44 crc kubenswrapper[4785]: healthz check failed
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.095138 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.109051 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.109202 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.60918193 +0000 UTC m=+148.287547694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.109614 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.109946 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.60993413 +0000 UTC m=+148.288299894 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.210674 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.211019 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.71099526 +0000 UTC m=+148.389361014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.231981 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-925q9"
Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.312178 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2"
Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.313110 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.813094527 +0000 UTC m=+148.491460291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.413235 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.413415 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.913389878 +0000 UTC m=+148.591755642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.413669 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.414067 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:44.914053565 +0000 UTC m=+148.592419319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.515495 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.515682 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.01565521 +0000 UTC m=+148.694020974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.617227 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.617598 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.117582062 +0000 UTC m=+148.795947826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.705437 4785 generic.go:334] "Generic (PLEG): container finished" podID="03c85461-2fda-45e4-a240-5984b368b216" containerID="73c0a7c4e4d85c990cc231d2230d8ecc9686418b34c9318fd8133954542441e5" exitCode=0 Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.705521 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" event={"ID":"03c85461-2fda-45e4-a240-5984b368b216","Type":"ContainerDied","Data":"73c0a7c4e4d85c990cc231d2230d8ecc9686418b34c9318fd8133954542441e5"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.705585 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" event={"ID":"03c85461-2fda-45e4-a240-5984b368b216","Type":"ContainerStarted","Data":"6836094bdb7a188aa39e83892b4277bdc7a86de1bc29a4a1c667ef2bac8266f1"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.705597 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" event={"ID":"03c85461-2fda-45e4-a240-5984b368b216","Type":"ContainerStarted","Data":"f0b8204c183476a5c879d587647bcec0268e93386654ab200a7e334392e99357"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.707582 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" 
event={"ID":"2fb7b6f6-18b7-4fcd-b339-061bf56eb47a","Type":"ContainerStarted","Data":"011cd367b37bfcf5c8399c27f79f9d848d89febed7b1f8e26a2d0b9669cfe1ca"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.709280 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4k7ds" event={"ID":"8cd4a8fe-0b02-4ee4-b125-c23de9e56acf","Type":"ContainerStarted","Data":"1f0cc4309a0a6c8a8115f94789c6f90ef8d48442e34a1c4e31d47fd1860a0c65"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.723381 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.723816 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.223797576 +0000 UTC m=+148.902163340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.735728 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" event={"ID":"ed213209-32b5-4239-a332-afb26f56e83c","Type":"ContainerStarted","Data":"cd7b2d8227abc894d86581bc5f650b197530701cb19e9956c68ddbd65f65eda4"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.744602 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" event={"ID":"bc1be764-4067-4f28-83e1-684ab5cdebaa","Type":"ContainerStarted","Data":"ae0809f5a95c2c3f25687221a228b908ae4cb4080b29e3bc89b9cc65be60f2a9"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.746329 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-452cx" event={"ID":"20dcdb37-da72-4fd8-8927-23cd9799d180","Type":"ContainerStarted","Data":"e1f4138f2c096282d194e8c41b42b9280047279d99a3eb05eec90ab5972c4b22"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.746401 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-452cx" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.747580 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-98w8d" event={"ID":"c75fc6a1-3a11-401c-a519-285f2d2fed86","Type":"ContainerStarted","Data":"522d47886a0e751e402f3ee6160857e973d3994f4a403b7ac3547447edb8d234"} Nov 26 15:09:44 crc kubenswrapper[4785]: 
I1126 15:09:44.762740 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9wv9b" event={"ID":"3c750c46-8c0b-4853-80ec-dac544558a67","Type":"ContainerStarted","Data":"5732ef71f4432e0e0f6aa920aca32a34ac6254471633d0255123367b7fac51c3"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.774791 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mwxx7" event={"ID":"0a45fa88-9d4e-4801-8d54-a7fc5ca37240","Type":"ContainerStarted","Data":"31c30a284bbcebc3ddf6bdc7832b1665bdacfcd183e477d2f2f3f909f74916af"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.784149 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" event={"ID":"936506c8-413e-4978-830e-323d53a45cdf","Type":"ContainerStarted","Data":"7adf35646158785624d6c493df4250a002c23b994e1d10a469c7ccf60b106e58"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.784738 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.792656 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" podStartSLOduration=126.792640871 podStartE2EDuration="2m6.792640871s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:44.786593125 +0000 UTC m=+148.464958889" watchObservedRunningTime="2025-11-26 15:09:44.792640871 +0000 UTC m=+148.471006635" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.796723 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" event={"ID":"99f14898-6d00-4964-8eaa-f58db7e92512","Type":"ContainerStarted","Data":"70a49cee11888de2918bdf2cfbd8b280cee21c3da0266b7b61288326acc03091"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.804350 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" event={"ID":"5c5bd1c8-cf40-4ad2-8556-6269ab2a08d0","Type":"ContainerStarted","Data":"e22ca45bdca99ff4820b98d0293a7b47e6ea3d29d8d068335fa2b815d8cf2f06"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.826882 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" event={"ID":"bdb530b8-a573-45a5-beae-3d60ee06ede2","Type":"ContainerStarted","Data":"3a92961411a86b86d764e8771c4028c8878f597a15c96c756d9cf6fa1e7a1a7f"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.826997 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56grk" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.827620 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.828602 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.328589873 +0000 UTC m=+149.006955637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.844064 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" event={"ID":"b022f00c-c760-4fc7-85be-bf6bda07ed6c","Type":"ContainerStarted","Data":"b4c47fbed2e8d3a84b035ad16b72b994a893b64591b5f24e101f9857c04199ae"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.869781 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w4twq" podStartSLOduration=126.869763521 podStartE2EDuration="2m6.869763521s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:44.869125885 +0000 UTC m=+148.547491669" watchObservedRunningTime="2025-11-26 15:09:44.869763521 +0000 UTC m=+148.548129285" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.871492 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" event={"ID":"b9d58930-37e7-40ed-ab98-48654e752018","Type":"ContainerStarted","Data":"c983036657519bb02ccfc5a943d783617b91e015fe851507bbff95e955c31c49"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.883418 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" 
event={"ID":"5a190f82-8e34-4e31-b2a1-e773974ccf4e","Type":"ContainerStarted","Data":"800ccc38b4f30f89bb48fe9328307be1dc777951574554f03535488c2ef657be"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.883454 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" event={"ID":"5a190f82-8e34-4e31-b2a1-e773974ccf4e","Type":"ContainerStarted","Data":"d3c0e3d4a8deff332f3fad860f8d6f3b300c9775f4da2defba1ea09f9a4d994e"} Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.884649 4785 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cllbv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.884681 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.887249 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.887282 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 
15:09:44.904682 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wmz66" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.905569 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.928569 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.928773 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.42874061 +0000 UTC m=+149.107106384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.929101 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.929343 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.929435 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.929581 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.929679 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:44 crc kubenswrapper[4785]: E1126 15:09:44.931121 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.431109062 +0000 UTC m=+149.109474826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.933712 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.940245 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.941703 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fsdwl" podStartSLOduration=126.941689226 podStartE2EDuration="2m6.941689226s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:44.932687753 +0000 UTC m=+148.611053537" watchObservedRunningTime="2025-11-26 15:09:44.941689226 +0000 UTC m=+148.620054990" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.953165 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:44 crc kubenswrapper[4785]: I1126 15:09:44.981316 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.032716 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.033114 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.533092806 +0000 UTC m=+149.211458570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.033698 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.036001 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.535986071 +0000 UTC m=+149.214351835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.069288 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.076474 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dr58q" podStartSLOduration=127.07645838 podStartE2EDuration="2m7.07645838s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.07568181 +0000 UTC m=+148.754047584" watchObservedRunningTime="2025-11-26 15:09:45.07645838 +0000 UTC m=+148.754824144" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.079964 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.119935 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:45 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:45 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:45 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.120269 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.121027 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-452cx" podStartSLOduration=9.121005725 podStartE2EDuration="9.121005725s" 
podCreationTimestamp="2025-11-26 15:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.12081764 +0000 UTC m=+148.799183414" watchObservedRunningTime="2025-11-26 15:09:45.121005725 +0000 UTC m=+148.799371509" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.138047 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.138347 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.638332285 +0000 UTC m=+149.316698049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.156818 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.157636 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" podStartSLOduration=127.157622815 podStartE2EDuration="2m7.157622815s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.156046224 +0000 UTC m=+148.834411998" watchObservedRunningTime="2025-11-26 15:09:45.157622815 +0000 UTC m=+148.835988579" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.210694 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-mvkhw" podStartSLOduration=127.21067485 podStartE2EDuration="2m7.21067485s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.202782076 +0000 UTC m=+148.881147850" watchObservedRunningTime="2025-11-26 15:09:45.21067485 +0000 UTC m=+148.889040604" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.241589 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.241905 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-26 15:09:45.74189355 +0000 UTC m=+149.420259314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.267943 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hlltg" podStartSLOduration=127.267921065 podStartE2EDuration="2m7.267921065s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.267112164 +0000 UTC m=+148.945477948" watchObservedRunningTime="2025-11-26 15:09:45.267921065 +0000 UTC m=+148.946286829" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.269004 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" podStartSLOduration=127.268994702 podStartE2EDuration="2m7.268994702s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.23650509 +0000 UTC m=+148.914870864" watchObservedRunningTime="2025-11-26 15:09:45.268994702 +0000 UTC m=+148.947360476" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.343116 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.343344 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.843309839 +0000 UTC m=+149.521675603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.343397 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.343702 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.843690979 +0000 UTC m=+149.522056743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.358982 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bzwjs" podStartSLOduration=127.358963185 podStartE2EDuration="2m7.358963185s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:45.357756344 +0000 UTC m=+149.036122108" watchObservedRunningTime="2025-11-26 15:09:45.358963185 +0000 UTC m=+149.037328949" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.453632 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.454164 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:45.954144083 +0000 UTC m=+149.632509847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.566596 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.567256 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.067242065 +0000 UTC m=+149.745607829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.583611 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zb7zt"] Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.584476 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.602499 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zb7zt"] Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.602701 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.669095 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.669310 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-catalog-content\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 
26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.669369 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-utilities\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.669406 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gctt6\" (UniqueName: \"kubernetes.io/projected/9558a7a8-65c7-4127-9174-aa1036efc91f-kube-api-access-gctt6\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.669529 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.169510046 +0000 UTC m=+149.847875810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.771695 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gctt6\" (UniqueName: \"kubernetes.io/projected/9558a7a8-65c7-4127-9174-aa1036efc91f-kube-api-access-gctt6\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.771792 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-catalog-content\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.771819 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.771858 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-utilities\") pod \"certified-operators-zb7zt\" (UID: 
\"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.773293 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-catalog-content\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.773585 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.273574884 +0000 UTC m=+149.951940648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.776721 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q8fnj"] Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.777880 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-utilities\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.787664 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.791877 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q8fnj"] Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.793999 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.817709 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gctt6\" (UniqueName: \"kubernetes.io/projected/9558a7a8-65c7-4127-9174-aa1036efc91f-kube-api-access-gctt6\") pod \"certified-operators-zb7zt\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.874163 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.874301 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-utilities\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.874345 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsx94\" (UniqueName: \"kubernetes.io/projected/c5c2881b-79ef-4249-a393-dc3141e2e7c2-kube-api-access-dsx94\") pod \"community-operators-q8fnj\" (UID: 
\"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.874403 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-catalog-content\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.874499 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.374483501 +0000 UTC m=+150.052849265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.886038 4785 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jmg2s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.886083 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.889146 4785 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4z7ct container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.889195 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" podUID="16404b4a-e9c8-4e18-bd40-0bdcab054a44" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.958949 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" event={"ID":"b022f00c-c760-4fc7-85be-bf6bda07ed6c","Type":"ContainerStarted","Data":"363df07b0ecc49e45521a194ddd0850383301313b137b336755c50dcafd49c71"} Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.961860 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.962117 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.962156 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.987138 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.987188 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-catalog-content\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.987225 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-utilities\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " 
pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.987256 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsx94\" (UniqueName: \"kubernetes.io/projected/c5c2881b-79ef-4249-a393-dc3141e2e7c2-kube-api-access-dsx94\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: E1126 15:09:45.987776 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.487766148 +0000 UTC m=+150.166131912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.988087 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-utilities\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:45 crc kubenswrapper[4785]: I1126 15:09:45.988174 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-catalog-content\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " 
pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.008540 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tj757"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.009624 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.022146 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsx94\" (UniqueName: \"kubernetes.io/projected/c5c2881b-79ef-4249-a393-dc3141e2e7c2-kube-api-access-dsx94\") pod \"community-operators-q8fnj\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.048350 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tj757"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.096065 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.096386 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgh9q\" (UniqueName: \"kubernetes.io/projected/24729921-e706-4a67-a93d-70bcc92b7bb8-kube-api-access-pgh9q\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.097405 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.59738608 +0000 UTC m=+150.275751844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.098047 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:46 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:46 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:46 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.098102 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.101525 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-utilities\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc 
kubenswrapper[4785]: I1126 15:09:46.101641 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-catalog-content\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.160693 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.178466 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zdn6b"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.179628 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.204747 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zdn6b"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.212251 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.212322 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-utilities\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc 
kubenswrapper[4785]: I1126 15:09:46.212344 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-catalog-content\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.212387 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgh9q\" (UniqueName: \"kubernetes.io/projected/24729921-e706-4a67-a93d-70bcc92b7bb8-kube-api-access-pgh9q\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.212904 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.712890945 +0000 UTC m=+150.391256709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.213322 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-utilities\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.213530 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-catalog-content\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.306747 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgh9q\" (UniqueName: \"kubernetes.io/projected/24729921-e706-4a67-a93d-70bcc92b7bb8-kube-api-access-pgh9q\") pod \"certified-operators-tj757\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.316795 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.316876 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.816860871 +0000 UTC m=+150.495226635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.317199 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.317265 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-utilities\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.317356 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-catalog-content\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.317432 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mqr2\" (UniqueName: \"kubernetes.io/projected/441df9b7-56eb-451e-bdfa-beb5046aa892-kube-api-access-8mqr2\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.317507 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.817489497 +0000 UTC m=+150.495855261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.385981 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:09:46 crc kubenswrapper[4785]: W1126 15:09:46.413850 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-abb0f41855d99033bd09ab38e9097d0c1d33dd3eddf3ec7f947804ae4fb61aa0 WatchSource:0}: Error finding container abb0f41855d99033bd09ab38e9097d0c1d33dd3eddf3ec7f947804ae4fb61aa0: Status 404 returned error can't find the container with id abb0f41855d99033bd09ab38e9097d0c1d33dd3eddf3ec7f947804ae4fb61aa0 Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.419459 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.419753 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-catalog-content\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.419818 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mqr2\" (UniqueName: \"kubernetes.io/projected/441df9b7-56eb-451e-bdfa-beb5046aa892-kube-api-access-8mqr2\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.419858 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-utilities\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.420464 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-utilities\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.420598 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:46.92057947 +0000 UTC m=+150.598945234 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.420905 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-catalog-content\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.462486 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mqr2\" (UniqueName: 
\"kubernetes.io/projected/441df9b7-56eb-451e-bdfa-beb5046aa892-kube-api-access-8mqr2\") pod \"community-operators-zdn6b\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.521531 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.521937 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.021921328 +0000 UTC m=+150.700287092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.574825 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.609079 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q8fnj"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.622934 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.623272 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.123252975 +0000 UTC m=+150.801618739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.663798 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zb7zt"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.728681 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.729116 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.22910452 +0000 UTC m=+150.907470284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.747571 4785 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.832416 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.832624 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.332517271 +0000 UTC m=+151.010883035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.832843 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.833315 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.333303432 +0000 UTC m=+151.011669196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.866150 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tj757"] Nov 26 15:09:46 crc kubenswrapper[4785]: W1126 15:09:46.912514 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24729921_e706_4a67_a93d_70bcc92b7bb8.slice/crio-0d30f8ccb7be1cea7407bd83ecd05461018f9d8fa6ddb650a27f7d5613b7b217 WatchSource:0}: Error finding container 0d30f8ccb7be1cea7407bd83ecd05461018f9d8fa6ddb650a27f7d5613b7b217: Status 404 returned error can't find the container with id 0d30f8ccb7be1cea7407bd83ecd05461018f9d8fa6ddb650a27f7d5613b7b217 Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.935338 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:46 crc kubenswrapper[4785]: E1126 15:09:46.948758 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.448721114 +0000 UTC m=+151.127086878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.989858 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerStarted","Data":"87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1"} Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.989899 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerStarted","Data":"c57f52d524b132a6298c0b6d4f6cdf6751ca2b2ef6821e3cd9b20c9fabc15bfa"} Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.996584 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zdn6b"] Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.998310 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fe3626c3cbdab45b594cd793b69192be0088c9e2ba3ff447f24625a0d3c8bbc5"} Nov 26 15:09:46 crc kubenswrapper[4785]: I1126 15:09:46.998350 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"abb0f41855d99033bd09ab38e9097d0c1d33dd3eddf3ec7f947804ae4fb61aa0"} Nov 26 15:09:47 crc kubenswrapper[4785]: 
I1126 15:09:47.008667 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2544c8199945aea5cfe6e2a4cf717dc94a9beac94f213bda1232fd05857d2231"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.008730 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"83faf3c96150f9bc15e5a6e55d7d3bd8178f0f48f37cd18f9051ccf412f75b03"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.009540 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.024032 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" event={"ID":"b022f00c-c760-4fc7-85be-bf6bda07ed6c","Type":"ContainerStarted","Data":"6d0eee65fcd66a43286e0b7edfe500534c392d336089a290d2a829175196d2ba"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.024080 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" event={"ID":"b022f00c-c760-4fc7-85be-bf6bda07ed6c","Type":"ContainerStarted","Data":"671b1db5ca5623aafe976427c285989cb54488955ee375d1243b903419321c20"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.026688 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"26121d066a0d6469daf0e0e35aab53f323bb5cfad20c8126a431d4a3a0ad213a"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.026723 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"43c06f5be68545ac5f45b3f2695009aef929ad9d709be036e4ea760d7706f6c4"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.028039 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj757" event={"ID":"24729921-e706-4a67-a93d-70bcc92b7bb8","Type":"ContainerStarted","Data":"0d30f8ccb7be1cea7407bd83ecd05461018f9d8fa6ddb650a27f7d5613b7b217"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.033703 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerStarted","Data":"91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.033760 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerStarted","Data":"0328e44a93acc37b968cfd29fbdaa8b8ae10b88d6b11611cbb4c7e18393df392"} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.035860 4785 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.049961 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:47 crc kubenswrapper[4785]: E1126 15:09:47.050320 4785 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.550308238 +0000 UTC m=+151.228674002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrsd2" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.094188 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:47 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:47 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:47 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.094265 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.130007 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-s7n5j" podStartSLOduration=11.129994264 podStartE2EDuration="11.129994264s" podCreationTimestamp="2025-11-26 15:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:47.123114706 
+0000 UTC m=+150.801480470" watchObservedRunningTime="2025-11-26 15:09:47.129994264 +0000 UTC m=+150.808360018" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.154379 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:47 crc kubenswrapper[4785]: E1126 15:09:47.155570 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-26 15:09:47.655544007 +0000 UTC m=+151.333909761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.183484 4785 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-26T15:09:46.747594729Z","Handler":null,"Name":""} Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.186150 4785 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.186181 4785 csi_plugin.go:113] 
kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.255989 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.272027 4785 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.272091 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.302765 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrsd2\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.356656 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.366513 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.396408 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z8vql" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.470537 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.564770 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-74t7j"] Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.565686 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.568119 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.581114 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-74t7j"] Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.740939 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrsd2"] Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.764050 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-utilities\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.764110 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-catalog-content\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.764157 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqp65\" (UniqueName: \"kubernetes.io/projected/5b0f6908-63a8-487c-ac41-e81114c43311-kube-api-access-kqp65\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.865066 4785 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-utilities\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.865120 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-catalog-content\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.865159 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqp65\" (UniqueName: \"kubernetes.io/projected/5b0f6908-63a8-487c-ac41-e81114c43311-kube-api-access-kqp65\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.865848 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-utilities\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.866049 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-catalog-content\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.890508 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqp65\" (UniqueName: 
\"kubernetes.io/projected/5b0f6908-63a8-487c-ac41-e81114c43311-kube-api-access-kqp65\") pod \"redhat-marketplace-74t7j\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.970493 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-896ql"] Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.972055 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:47 crc kubenswrapper[4785]: I1126 15:09:47.986467 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-896ql"] Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.043764 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" event={"ID":"80f8d801-cad5-4d41-b2c7-1bb2306b1b25","Type":"ContainerStarted","Data":"7060ad56096ad31d3e5870ece508a21a402da0ff8f11416b4889b83f0913d76e"} Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.045369 4785 generic.go:334] "Generic (PLEG): container finished" podID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerID="1eb5d3ebaecb8563a2e3d569df9fa6dc49da0f1e9906351be49138fa82d7260a" exitCode=0 Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.045430 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zdn6b" event={"ID":"441df9b7-56eb-451e-bdfa-beb5046aa892","Type":"ContainerDied","Data":"1eb5d3ebaecb8563a2e3d569df9fa6dc49da0f1e9906351be49138fa82d7260a"} Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.045452 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zdn6b" event={"ID":"441df9b7-56eb-451e-bdfa-beb5046aa892","Type":"ContainerStarted","Data":"59eae0619bfd2a2a2298d7a106ab678b77733c64020351eca1ac096bde4b084b"} Nov 26 
15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.049498 4785 generic.go:334] "Generic (PLEG): container finished" podID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerID="ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9" exitCode=0 Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.049623 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj757" event={"ID":"24729921-e706-4a67-a93d-70bcc92b7bb8","Type":"ContainerDied","Data":"ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9"} Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.056238 4785 generic.go:334] "Generic (PLEG): container finished" podID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerID="91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e" exitCode=0 Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.056321 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerDied","Data":"91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e"} Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.057983 4785 generic.go:334] "Generic (PLEG): container finished" podID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerID="87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1" exitCode=0 Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.058140 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerDied","Data":"87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1"} Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.070303 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-utilities\") pod 
\"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.070355 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-catalog-content\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.070460 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxml9\" (UniqueName: \"kubernetes.io/projected/fb128907-8fff-4ad4-99f3-2877b676c46a-kube-api-access-fxml9\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.092016 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:48 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:48 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:48 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.092066 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.170783 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxml9\" (UniqueName: 
\"kubernetes.io/projected/fb128907-8fff-4ad4-99f3-2877b676c46a-kube-api-access-fxml9\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.170853 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-utilities\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.170894 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-catalog-content\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.171365 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-catalog-content\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.171481 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-utilities\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.181404 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.192758 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxml9\" (UniqueName: \"kubernetes.io/projected/fb128907-8fff-4ad4-99f3-2877b676c46a-kube-api-access-fxml9\") pod \"redhat-marketplace-896ql\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.302434 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.390874 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-74t7j"] Nov 26 15:09:48 crc kubenswrapper[4785]: W1126 15:09:48.402225 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0f6908_63a8_487c_ac41_e81114c43311.slice/crio-138862daa24f51976894076a140afe04aa715795ad6946449ebf19c5d581ecfe WatchSource:0}: Error finding container 138862daa24f51976894076a140afe04aa715795ad6946449ebf19c5d581ecfe: Status 404 returned error can't find the container with id 138862daa24f51976894076a140afe04aa715795ad6946449ebf19c5d581ecfe Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.439532 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.439589 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.444218 4785 patch_prober.go:28] interesting pod/console-f9d7485db-6cx55 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": 
dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.444254 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6cx55" podUID="44ec5c36-3bfd-42c0-a4a7-28f7e0e5aad7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.525150 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-896ql"] Nov 26 15:09:48 crc kubenswrapper[4785]: W1126 15:09:48.538752 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb128907_8fff_4ad4_99f3_2877b676c46a.slice/crio-a652f1465c2b1af444b103091a41db928b422980128f1d5e71fd032c5597271a WatchSource:0}: Error finding container a652f1465c2b1af444b103091a41db928b422980128f1d5e71fd032c5597271a: Status 404 returned error can't find the container with id a652f1465c2b1af444b103091a41db928b422980128f1d5e71fd032c5597271a Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.768879 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rqxdc"] Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.773225 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.775859 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.789705 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqxdc"] Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.827882 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.828646 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.830791 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.831019 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.837861 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.879179 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wkz\" (UniqueName: \"kubernetes.io/projected/afb002bc-9134-4d4c-b956-15ad3e700c49-kube-api-access-b7wkz\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.879615 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-catalog-content\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.879650 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-utilities\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.968316 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.968387 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.975681 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.981270 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91accea7-45f2-45c7-b620-12023af1f863-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.981331 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wkz\" (UniqueName: \"kubernetes.io/projected/afb002bc-9134-4d4c-b956-15ad3e700c49-kube-api-access-b7wkz\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " 
pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.981649 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-catalog-content\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.981706 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-utilities\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.981794 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91accea7-45f2-45c7-b620-12023af1f863-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.993315 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pnqcw" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.994855 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-catalog-content\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:48 crc kubenswrapper[4785]: I1126 15:09:48.995076 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-utilities\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.001102 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wkz\" (UniqueName: \"kubernetes.io/projected/afb002bc-9134-4d4c-b956-15ad3e700c49-kube-api-access-b7wkz\") pod \"redhat-operators-rqxdc\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.013814 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.080860 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.080916 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.080859 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.081289 4785 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.082899 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91accea7-45f2-45c7-b620-12023af1f863-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.082956 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91accea7-45f2-45c7-b620-12023af1f863-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.083057 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91accea7-45f2-45c7-b620-12023af1f863-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.091632 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.092776 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.093056 4785 generic.go:334] "Generic (PLEG): container finished" 
podID="5b0f6908-63a8-487c-ac41-e81114c43311" containerID="e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb" exitCode=0 Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.093125 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74t7j" event={"ID":"5b0f6908-63a8-487c-ac41-e81114c43311","Type":"ContainerDied","Data":"e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb"} Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.093150 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74t7j" event={"ID":"5b0f6908-63a8-487c-ac41-e81114c43311","Type":"ContainerStarted","Data":"138862daa24f51976894076a140afe04aa715795ad6946449ebf19c5d581ecfe"} Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.093880 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:49 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:49 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:49 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.093996 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.103803 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.105295 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91accea7-45f2-45c7-b620-12023af1f863-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.108371 4785 generic.go:334] "Generic (PLEG): container finished" podID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerID="20e658ba377105ee5e2b92633febaba966ca089d23bf144ed2f41f029217e02a" exitCode=0 Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.108455 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896ql" event={"ID":"fb128907-8fff-4ad4-99f3-2877b676c46a","Type":"ContainerDied","Data":"20e658ba377105ee5e2b92633febaba966ca089d23bf144ed2f41f029217e02a"} Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.108479 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896ql" event={"ID":"fb128907-8fff-4ad4-99f3-2877b676c46a","Type":"ContainerStarted","Data":"a652f1465c2b1af444b103091a41db928b422980128f1d5e71fd032c5597271a"} Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.111616 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" event={"ID":"80f8d801-cad5-4d41-b2c7-1bb2306b1b25","Type":"ContainerStarted","Data":"a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7"} Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.111685 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.116958 4785 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4h5d5" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.118940 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.118967 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.127951 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.133799 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" podStartSLOduration=131.133783089 podStartE2EDuration="2m11.133783089s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:49.131857899 +0000 UTC m=+152.810223663" watchObservedRunningTime="2025-11-26 15:09:49.133783089 +0000 UTC m=+152.812148853" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.150803 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.188191 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f5pwc"] Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.189570 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.199490 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5pwc"] Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.238004 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.396241 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4z6j\" (UniqueName: \"kubernetes.io/projected/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-kube-api-access-z4z6j\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.396384 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-catalog-content\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.396452 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-utilities\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.489175 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4z7ct" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.500338 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-catalog-content\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.500400 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-utilities\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.500472 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4z6j\" (UniqueName: \"kubernetes.io/projected/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-kube-api-access-z4z6j\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.501568 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-catalog-content\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.502356 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-utilities\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.526948 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4z6j\" 
(UniqueName: \"kubernetes.io/projected/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-kube-api-access-z4z6j\") pod \"redhat-operators-f5pwc\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.553059 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.559720 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 26 15:09:49 crc kubenswrapper[4785]: I1126 15:09:49.802335 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqxdc"] Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.069100 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5pwc"] Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.094949 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:50 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:50 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:50 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.095012 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.136578 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"91accea7-45f2-45c7-b620-12023af1f863","Type":"ContainerStarted","Data":"1c8a8d4142423769c2c5b5b320ec3f719ff7ccf3fb82eff516664490c1e990f6"} Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.140602 4785 generic.go:334] "Generic (PLEG): container finished" podID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerID="31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67" exitCode=0 Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.140777 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqxdc" event={"ID":"afb002bc-9134-4d4c-b956-15ad3e700c49","Type":"ContainerDied","Data":"31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67"} Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.140811 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqxdc" event={"ID":"afb002bc-9134-4d4c-b956-15ad3e700c49","Type":"ContainerStarted","Data":"337b3925081fe5aa549c6af8a2761dbe6ab1da83c6a832f87daecafc8bd5d769"} Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.153569 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-s6nf7" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.418170 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.419730 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.421922 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.423068 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.423266 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.532394 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.532501 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.633812 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.633863 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.633943 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.654837 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:50 crc kubenswrapper[4785]: I1126 15:09:50.776171 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.089594 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:51 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:51 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:51 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.089647 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.155147 4785 generic.go:334] "Generic (PLEG): container finished" podID="91accea7-45f2-45c7-b620-12023af1f863" containerID="325cd2cc3eadd611900d9042668ea8da92807bc35185fb747076d57e25ea3797" exitCode=0 Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.155245 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"91accea7-45f2-45c7-b620-12023af1f863","Type":"ContainerDied","Data":"325cd2cc3eadd611900d9042668ea8da92807bc35185fb747076d57e25ea3797"} Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.158216 4785 generic.go:334] "Generic (PLEG): container finished" podID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerID="5051ff079b19344a07b9f0107c7aae8be173696989da76c3bcadfaa7ec02b5e5" exitCode=0 Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.158244 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5pwc" 
event={"ID":"b5c5c94f-25c6-4d9e-80c2-57f09055aba9","Type":"ContainerDied","Data":"5051ff079b19344a07b9f0107c7aae8be173696989da76c3bcadfaa7ec02b5e5"} Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.158584 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5pwc" event={"ID":"b5c5c94f-25c6-4d9e-80c2-57f09055aba9","Type":"ContainerStarted","Data":"027647937cacd60d2a2863b9525d2cebe4a1c95d40d1f48c402dd0dc76408503"} Nov 26 15:09:51 crc kubenswrapper[4785]: I1126 15:09:51.257245 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.089917 4785 patch_prober.go:28] interesting pod/router-default-5444994796-4v4sr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 26 15:09:52 crc kubenswrapper[4785]: [-]has-synced failed: reason withheld Nov 26 15:09:52 crc kubenswrapper[4785]: [+]process-running ok Nov 26 15:09:52 crc kubenswrapper[4785]: healthz check failed Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.090002 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v4sr" podUID="fb4884c6-873b-4728-8633-4ce0b794dfcd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.179871 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed","Type":"ContainerStarted","Data":"b650974a571488acb0a00ac96d50c22c229581e3f273d2f1b918116898fb2d81"} Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.179921 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed","Type":"ContainerStarted","Data":"07104a3b6757a2b6a6d8f7ea7c7f9139735c15e0cccac5e0deec01ada840ada6"} Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.213630 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.213611124 podStartE2EDuration="2.213611124s" podCreationTimestamp="2025-11-26 15:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:09:52.211410777 +0000 UTC m=+155.889776541" watchObservedRunningTime="2025-11-26 15:09:52.213611124 +0000 UTC m=+155.891976888" Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.521602 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.662930 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91accea7-45f2-45c7-b620-12023af1f863-kube-api-access\") pod \"91accea7-45f2-45c7-b620-12023af1f863\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.662986 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91accea7-45f2-45c7-b620-12023af1f863-kubelet-dir\") pod \"91accea7-45f2-45c7-b620-12023af1f863\" (UID: \"91accea7-45f2-45c7-b620-12023af1f863\") " Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.663100 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91accea7-45f2-45c7-b620-12023af1f863-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "91accea7-45f2-45c7-b620-12023af1f863" (UID: "91accea7-45f2-45c7-b620-12023af1f863"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.663356 4785 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91accea7-45f2-45c7-b620-12023af1f863-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.667971 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91accea7-45f2-45c7-b620-12023af1f863-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "91accea7-45f2-45c7-b620-12023af1f863" (UID: "91accea7-45f2-45c7-b620-12023af1f863"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:52 crc kubenswrapper[4785]: I1126 15:09:52.764472 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91accea7-45f2-45c7-b620-12023af1f863-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.091321 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.100985 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-4v4sr" Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.204788 4785 generic.go:334] "Generic (PLEG): container finished" podID="d36477d3-bcc2-47e0-8aab-687e7ae01f9e" containerID="0aeed79d147bc76f39e87b05fe0e521053bd74345d229524269acb2c64fdc1e2" exitCode=0 Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.204856 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" 
event={"ID":"d36477d3-bcc2-47e0-8aab-687e7ae01f9e","Type":"ContainerDied","Data":"0aeed79d147bc76f39e87b05fe0e521053bd74345d229524269acb2c64fdc1e2"} Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.225481 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"91accea7-45f2-45c7-b620-12023af1f863","Type":"ContainerDied","Data":"1c8a8d4142423769c2c5b5b320ec3f719ff7ccf3fb82eff516664490c1e990f6"} Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.225519 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c8a8d4142423769c2c5b5b320ec3f719ff7ccf3fb82eff516664490c1e990f6" Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.225585 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.228611 4785 generic.go:334] "Generic (PLEG): container finished" podID="eafbd7cf-352b-45ee-b6f3-a796ae7a49ed" containerID="b650974a571488acb0a00ac96d50c22c229581e3f273d2f1b918116898fb2d81" exitCode=0 Nov 26 15:09:53 crc kubenswrapper[4785]: I1126 15:09:53.229625 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed","Type":"ContainerDied","Data":"b650974a571488acb0a00ac96d50c22c229581e3f273d2f1b918116898fb2d81"} Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.246148 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-452cx" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.652460 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.673886 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.830141 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-config-volume\") pod \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.830268 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgg4l\" (UniqueName: \"kubernetes.io/projected/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-kube-api-access-dgg4l\") pod \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.830293 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kubelet-dir\") pod \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.830325 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kube-api-access\") pod \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\" (UID: \"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed\") " Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.830386 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-secret-volume\") pod \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\" (UID: \"d36477d3-bcc2-47e0-8aab-687e7ae01f9e\") " Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.831617 4785 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eafbd7cf-352b-45ee-b6f3-a796ae7a49ed" (UID: "eafbd7cf-352b-45ee-b6f3-a796ae7a49ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.832124 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d36477d3-bcc2-47e0-8aab-687e7ae01f9e" (UID: "d36477d3-bcc2-47e0-8aab-687e7ae01f9e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.837983 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d36477d3-bcc2-47e0-8aab-687e7ae01f9e" (UID: "d36477d3-bcc2-47e0-8aab-687e7ae01f9e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.840829 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-kube-api-access-dgg4l" (OuterVolumeSpecName: "kube-api-access-dgg4l") pod "d36477d3-bcc2-47e0-8aab-687e7ae01f9e" (UID: "d36477d3-bcc2-47e0-8aab-687e7ae01f9e"). InnerVolumeSpecName "kube-api-access-dgg4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.846105 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eafbd7cf-352b-45ee-b6f3-a796ae7a49ed" (UID: "eafbd7cf-352b-45ee-b6f3-a796ae7a49ed"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.932745 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgg4l\" (UniqueName: \"kubernetes.io/projected/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-kube-api-access-dgg4l\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.932863 4785 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.932895 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eafbd7cf-352b-45ee-b6f3-a796ae7a49ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.932908 4785 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:54 crc kubenswrapper[4785]: I1126 15:09:54.932916 4785 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d36477d3-bcc2-47e0-8aab-687e7ae01f9e-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:09:55 crc kubenswrapper[4785]: I1126 15:09:55.260089 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" event={"ID":"d36477d3-bcc2-47e0-8aab-687e7ae01f9e","Type":"ContainerDied","Data":"5b4f9b2012bb6182e942ffcb0b9b438fe88c639cf056d08c661d9f893e08df82"} Nov 26 15:09:55 crc kubenswrapper[4785]: I1126 15:09:55.260125 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4f9b2012bb6182e942ffcb0b9b438fe88c639cf056d08c661d9f893e08df82" 
Nov 26 15:09:55 crc kubenswrapper[4785]: I1126 15:09:55.260175 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402820-vcn7w" Nov 26 15:09:55 crc kubenswrapper[4785]: I1126 15:09:55.267933 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"eafbd7cf-352b-45ee-b6f3-a796ae7a49ed","Type":"ContainerDied","Data":"07104a3b6757a2b6a6d8f7ea7c7f9139735c15e0cccac5e0deec01ada840ada6"} Nov 26 15:09:55 crc kubenswrapper[4785]: I1126 15:09:55.267977 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07104a3b6757a2b6a6d8f7ea7c7f9139735c15e0cccac5e0deec01ada840ada6" Nov 26 15:09:55 crc kubenswrapper[4785]: I1126 15:09:55.268009 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 26 15:09:58 crc kubenswrapper[4785]: I1126 15:09:58.587037 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:58 crc kubenswrapper[4785]: I1126 15:09:58.591637 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6cx55" Nov 26 15:09:59 crc kubenswrapper[4785]: I1126 15:09:59.056934 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 26 15:09:59 crc kubenswrapper[4785]: I1126 15:09:59.056959 4785 patch_prober.go:28] interesting pod/downloads-7954f5f757-vn49n container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Nov 26 
15:09:59 crc kubenswrapper[4785]: I1126 15:09:59.057016 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 26 15:09:59 crc kubenswrapper[4785]: I1126 15:09:59.057009 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vn49n" podUID="cb9844e2-63a7-4437-ab9d-5047b2363580" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Nov 26 15:10:00 crc kubenswrapper[4785]: I1126 15:10:00.820978 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:10:00 crc kubenswrapper[4785]: I1126 15:10:00.826275 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72903df2-b694-4229-96b5-167500cab723-metrics-certs\") pod \"network-metrics-daemon-qdfwp\" (UID: \"72903df2-b694-4229-96b5-167500cab723\") " pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:10:00 crc kubenswrapper[4785]: I1126 15:10:00.958224 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qdfwp" Nov 26 15:10:07 crc kubenswrapper[4785]: I1126 15:10:07.289546 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:10:07 crc kubenswrapper[4785]: I1126 15:10:07.290209 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:10:07 crc kubenswrapper[4785]: I1126 15:10:07.481575 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:10:09 crc kubenswrapper[4785]: I1126 15:10:09.063523 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vn49n" Nov 26 15:10:19 crc kubenswrapper[4785]: I1126 15:10:19.743812 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qr82q" Nov 26 15:10:25 crc kubenswrapper[4785]: I1126 15:10:25.956825 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 26 15:10:29 crc kubenswrapper[4785]: E1126 15:10:29.377444 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 26 15:10:29 crc kubenswrapper[4785]: E1126 15:10:29.377999 4785 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gctt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zb7zt_openshift-marketplace(9558a7a8-65c7-4127-9174-aa1036efc91f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:29 crc kubenswrapper[4785]: E1126 15:10:29.379424 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zb7zt" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" Nov 26 15:10:32 crc kubenswrapper[4785]: E1126 15:10:32.695196 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zb7zt" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" Nov 26 15:10:34 crc kubenswrapper[4785]: E1126 15:10:34.856157 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 26 15:10:34 crc kubenswrapper[4785]: E1126 15:10:34.856544 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dsx94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-q8fnj_openshift-marketplace(c5c2881b-79ef-4249-a393-dc3141e2e7c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:34 crc kubenswrapper[4785]: E1126 15:10:34.857796 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-q8fnj" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" Nov 26 15:10:36 crc 
kubenswrapper[4785]: E1126 15:10:36.106332 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q8fnj" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" Nov 26 15:10:36 crc kubenswrapper[4785]: E1126 15:10:36.361851 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 26 15:10:36 crc kubenswrapper[4785]: E1126 15:10:36.362151 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pgh9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tj757_openshift-marketplace(24729921-e706-4a67-a93d-70bcc92b7bb8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:36 crc kubenswrapper[4785]: E1126 15:10:36.363424 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tj757" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" Nov 26 15:10:37 crc 
kubenswrapper[4785]: I1126 15:10:37.289594 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:10:37 crc kubenswrapper[4785]: I1126 15:10:37.292917 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:10:37 crc kubenswrapper[4785]: E1126 15:10:37.307742 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 26 15:10:37 crc kubenswrapper[4785]: E1126 15:10:37.307995 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kqp65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-74t7j_openshift-marketplace(5b0f6908-63a8-487c-ac41-e81114c43311): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:37 crc kubenswrapper[4785]: E1126 15:10:37.309216 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-74t7j" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" Nov 26 15:10:51 crc 
kubenswrapper[4785]: W1126 15:10:51.045464 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72903df2_b694_4229_96b5_167500cab723.slice/crio-8d6dd8cad20e247a0ec0dc75b695bdec80408af99cc77a1e0e06fad9fac13005 WatchSource:0}: Error finding container 8d6dd8cad20e247a0ec0dc75b695bdec80408af99cc77a1e0e06fad9fac13005: Status 404 returned error can't find the container with id 8d6dd8cad20e247a0ec0dc75b695bdec80408af99cc77a1e0e06fad9fac13005 Nov 26 15:10:51 crc kubenswrapper[4785]: I1126 15:10:51.057018 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qdfwp"] Nov 26 15:10:51 crc kubenswrapper[4785]: E1126 15:10:51.365809 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 26 15:10:51 crc kubenswrapper[4785]: E1126 15:10:51.366110 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b7wkz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rqxdc_openshift-marketplace(afb002bc-9134-4d4c-b956-15ad3e700c49): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:51 crc kubenswrapper[4785]: E1126 15:10:51.368588 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rqxdc" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" Nov 26 15:10:51 crc 
kubenswrapper[4785]: I1126 15:10:51.636338 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" event={"ID":"72903df2-b694-4229-96b5-167500cab723","Type":"ContainerStarted","Data":"8d6dd8cad20e247a0ec0dc75b695bdec80408af99cc77a1e0e06fad9fac13005"} Nov 26 15:10:51 crc kubenswrapper[4785]: E1126 15:10:51.638234 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rqxdc" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" Nov 26 15:10:52 crc kubenswrapper[4785]: I1126 15:10:52.642574 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" event={"ID":"72903df2-b694-4229-96b5-167500cab723","Type":"ContainerStarted","Data":"0be736f3dd7f7a70cc01c80156fdd4de070d6f40e8688733227767689a37534a"} Nov 26 15:10:52 crc kubenswrapper[4785]: E1126 15:10:52.682686 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 26 15:10:52 crc kubenswrapper[4785]: E1126 15:10:52.682945 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fxml9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-896ql_openshift-marketplace(fb128907-8fff-4ad4-99f3-2877b676c46a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:52 crc kubenswrapper[4785]: E1126 15:10:52.684243 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-896ql" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" Nov 26 15:10:53 crc 
kubenswrapper[4785]: I1126 15:10:53.647874 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qdfwp" event={"ID":"72903df2-b694-4229-96b5-167500cab723","Type":"ContainerStarted","Data":"742428bca5bdf0000c25c9d7ac04444e2b9546c756eb0bbc85b91fbc7ab2fd00"} Nov 26 15:10:53 crc kubenswrapper[4785]: I1126 15:10:53.665682 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qdfwp" podStartSLOduration=195.665658886 podStartE2EDuration="3m15.665658886s" podCreationTimestamp="2025-11-26 15:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:10:53.664623226 +0000 UTC m=+217.342989040" watchObservedRunningTime="2025-11-26 15:10:53.665658886 +0000 UTC m=+217.344024680" Nov 26 15:10:55 crc kubenswrapper[4785]: E1126 15:10:55.684577 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 26 15:10:55 crc kubenswrapper[4785]: E1126 15:10:55.685205 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4z6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-f5pwc_openshift-marketplace(b5c5c94f-25c6-4d9e-80c2-57f09055aba9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:55 crc kubenswrapper[4785]: E1126 15:10:55.686868 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-f5pwc" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" Nov 26 15:10:55 crc 
kubenswrapper[4785]: E1126 15:10:55.904058 4785 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 26 15:10:55 crc kubenswrapper[4785]: E1126 15:10:55.904263 4785 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mqr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-zdn6b_openshift-marketplace(441df9b7-56eb-451e-bdfa-beb5046aa892): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 26 15:10:55 crc kubenswrapper[4785]: E1126 15:10:55.905544 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zdn6b" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" Nov 26 15:10:56 crc kubenswrapper[4785]: E1126 15:10:56.890186 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zdn6b" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" Nov 26 15:10:56 crc kubenswrapper[4785]: E1126 15:10:56.890262 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-f5pwc" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.671737 4785 generic.go:334] "Generic (PLEG): container finished" podID="5b0f6908-63a8-487c-ac41-e81114c43311" containerID="acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade" exitCode=0 Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.671785 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74t7j" event={"ID":"5b0f6908-63a8-487c-ac41-e81114c43311","Type":"ContainerDied","Data":"acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade"} 
Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.674084 4785 generic.go:334] "Generic (PLEG): container finished" podID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerID="01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49" exitCode=0 Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.674156 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj757" event={"ID":"24729921-e706-4a67-a93d-70bcc92b7bb8","Type":"ContainerDied","Data":"01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49"} Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.676954 4785 generic.go:334] "Generic (PLEG): container finished" podID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerID="f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d" exitCode=0 Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.677001 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerDied","Data":"f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d"} Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.679438 4785 generic.go:334] "Generic (PLEG): container finished" podID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerID="82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272" exitCode=0 Nov 26 15:10:57 crc kubenswrapper[4785]: I1126 15:10:57.679476 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerDied","Data":"82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272"} Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.686813 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" 
event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerStarted","Data":"56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10"} Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.690096 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerStarted","Data":"f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330"} Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.692529 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74t7j" event={"ID":"5b0f6908-63a8-487c-ac41-e81114c43311","Type":"ContainerStarted","Data":"04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2"} Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.705996 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q8fnj" podStartSLOduration=2.387269712 podStartE2EDuration="1m13.705982618s" podCreationTimestamp="2025-11-26 15:09:45 +0000 UTC" firstStartedPulling="2025-11-26 15:09:47.035509465 +0000 UTC m=+150.713875229" lastFinishedPulling="2025-11-26 15:10:58.354222371 +0000 UTC m=+222.032588135" observedRunningTime="2025-11-26 15:10:58.704361032 +0000 UTC m=+222.382726806" watchObservedRunningTime="2025-11-26 15:10:58.705982618 +0000 UTC m=+222.384348372" Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.708377 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj757" event={"ID":"24729921-e706-4a67-a93d-70bcc92b7bb8","Type":"ContainerStarted","Data":"2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1"} Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.720442 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zb7zt" podStartSLOduration=3.561616851 
podStartE2EDuration="1m13.720425325s" podCreationTimestamp="2025-11-26 15:09:45 +0000 UTC" firstStartedPulling="2025-11-26 15:09:48.058952021 +0000 UTC m=+151.737317785" lastFinishedPulling="2025-11-26 15:10:58.217760485 +0000 UTC m=+221.896126259" observedRunningTime="2025-11-26 15:10:58.717628794 +0000 UTC m=+222.395994578" watchObservedRunningTime="2025-11-26 15:10:58.720425325 +0000 UTC m=+222.398791089" Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.737232 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-74t7j" podStartSLOduration=3.619296638 podStartE2EDuration="1m11.737211819s" podCreationTimestamp="2025-11-26 15:09:47 +0000 UTC" firstStartedPulling="2025-11-26 15:09:50.183174248 +0000 UTC m=+153.861540022" lastFinishedPulling="2025-11-26 15:10:58.301089439 +0000 UTC m=+221.979455203" observedRunningTime="2025-11-26 15:10:58.732308328 +0000 UTC m=+222.410674112" watchObservedRunningTime="2025-11-26 15:10:58.737211819 +0000 UTC m=+222.415577583" Nov 26 15:10:58 crc kubenswrapper[4785]: I1126 15:10:58.754811 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tj757" podStartSLOduration=3.666953739 podStartE2EDuration="1m13.754788366s" podCreationTimestamp="2025-11-26 15:09:45 +0000 UTC" firstStartedPulling="2025-11-26 15:09:48.052130044 +0000 UTC m=+151.730495808" lastFinishedPulling="2025-11-26 15:10:58.139964671 +0000 UTC m=+221.818330435" observedRunningTime="2025-11-26 15:10:58.751608665 +0000 UTC m=+222.429974449" watchObservedRunningTime="2025-11-26 15:10:58.754788366 +0000 UTC m=+222.433154130" Nov 26 15:11:05 crc kubenswrapper[4785]: I1126 15:11:05.963193 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:11:05 crc kubenswrapper[4785]: I1126 15:11:05.963428 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.162031 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.162135 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.386961 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.387014 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.540320 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.541059 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.541534 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.788837 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.789472 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:11:06 crc kubenswrapper[4785]: I1126 15:11:06.796939 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:11:07 crc 
kubenswrapper[4785]: I1126 15:11:07.288785 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.288842 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.288890 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.289465 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.289620 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867" gracePeriod=600 Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.757378 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" 
containerID="5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867" exitCode=0 Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.757468 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867"} Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.866526 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tj757"] Nov 26 15:11:07 crc kubenswrapper[4785]: I1126 15:11:07.948979 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jmg2s"] Nov 26 15:11:08 crc kubenswrapper[4785]: I1126 15:11:08.181945 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:11:08 crc kubenswrapper[4785]: I1126 15:11:08.181992 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:11:08 crc kubenswrapper[4785]: I1126 15:11:08.237485 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:11:08 crc kubenswrapper[4785]: I1126 15:11:08.761639 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tj757" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="registry-server" containerID="cri-o://2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1" gracePeriod=2 Nov 26 15:11:08 crc kubenswrapper[4785]: I1126 15:11:08.811821 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.714488 4785 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.774876 4785 generic.go:334] "Generic (PLEG): container finished" podID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerID="2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1" exitCode=0 Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.774918 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj757" event={"ID":"24729921-e706-4a67-a93d-70bcc92b7bb8","Type":"ContainerDied","Data":"2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1"} Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.774947 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tj757" event={"ID":"24729921-e706-4a67-a93d-70bcc92b7bb8","Type":"ContainerDied","Data":"0d30f8ccb7be1cea7407bd83ecd05461018f9d8fa6ddb650a27f7d5613b7b217"} Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.774968 4785 scope.go:117] "RemoveContainer" containerID="2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.775076 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tj757" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.895723 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-catalog-content\") pod \"24729921-e706-4a67-a93d-70bcc92b7bb8\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.895779 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgh9q\" (UniqueName: \"kubernetes.io/projected/24729921-e706-4a67-a93d-70bcc92b7bb8-kube-api-access-pgh9q\") pod \"24729921-e706-4a67-a93d-70bcc92b7bb8\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.895818 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-utilities\") pod \"24729921-e706-4a67-a93d-70bcc92b7bb8\" (UID: \"24729921-e706-4a67-a93d-70bcc92b7bb8\") " Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.897044 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-utilities" (OuterVolumeSpecName: "utilities") pod "24729921-e706-4a67-a93d-70bcc92b7bb8" (UID: "24729921-e706-4a67-a93d-70bcc92b7bb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.901814 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24729921-e706-4a67-a93d-70bcc92b7bb8-kube-api-access-pgh9q" (OuterVolumeSpecName: "kube-api-access-pgh9q") pod "24729921-e706-4a67-a93d-70bcc92b7bb8" (UID: "24729921-e706-4a67-a93d-70bcc92b7bb8"). InnerVolumeSpecName "kube-api-access-pgh9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.997199 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgh9q\" (UniqueName: \"kubernetes.io/projected/24729921-e706-4a67-a93d-70bcc92b7bb8-kube-api-access-pgh9q\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:11 crc kubenswrapper[4785]: I1126 15:11:11.997225 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:12 crc kubenswrapper[4785]: I1126 15:11:12.388442 4785 scope.go:117] "RemoveContainer" containerID="01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49" Nov 26 15:11:12 crc kubenswrapper[4785]: I1126 15:11:12.448613 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24729921-e706-4a67-a93d-70bcc92b7bb8" (UID: "24729921-e706-4a67-a93d-70bcc92b7bb8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:12 crc kubenswrapper[4785]: I1126 15:11:12.503095 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24729921-e706-4a67-a93d-70bcc92b7bb8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:12 crc kubenswrapper[4785]: I1126 15:11:12.704323 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tj757"] Nov 26 15:11:12 crc kubenswrapper[4785]: I1126 15:11:12.706843 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tj757"] Nov 26 15:11:12 crc kubenswrapper[4785]: I1126 15:11:12.757762 4785 scope.go:117] "RemoveContainer" containerID="ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.047289 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" path="/var/lib/kubelet/pods/24729921-e706-4a67-a93d-70bcc92b7bb8/volumes" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.942643 4785 scope.go:117] "RemoveContainer" containerID="2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1" Nov 26 15:11:13 crc kubenswrapper[4785]: E1126 15:11:13.943971 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1\": container with ID starting with 2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1 not found: ID does not exist" containerID="2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.944042 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1"} err="failed to get 
container status \"2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1\": rpc error: code = NotFound desc = could not find container \"2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1\": container with ID starting with 2707a7b814ed1a074f485da224a9fb55dd312cef4fccff971e267284bb9e32b1 not found: ID does not exist" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.944081 4785 scope.go:117] "RemoveContainer" containerID="01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49" Nov 26 15:11:13 crc kubenswrapper[4785]: E1126 15:11:13.944459 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49\": container with ID starting with 01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49 not found: ID does not exist" containerID="01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.944500 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49"} err="failed to get container status \"01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49\": rpc error: code = NotFound desc = could not find container \"01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49\": container with ID starting with 01774895be5e631b44559d6896f781c4b0afee6dcfe83a8308ce31e9a9aa8b49 not found: ID does not exist" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.944526 4785 scope.go:117] "RemoveContainer" containerID="ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9" Nov 26 15:11:13 crc kubenswrapper[4785]: E1126 15:11:13.944854 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9\": container with ID starting with ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9 not found: ID does not exist" containerID="ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9" Nov 26 15:11:13 crc kubenswrapper[4785]: I1126 15:11:13.944942 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9"} err="failed to get container status \"ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9\": rpc error: code = NotFound desc = could not find container \"ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9\": container with ID starting with ae81b5cf1ca261fd3808ea671ed1ffbbb91e6a4931138d1f488b9a52a1bf55b9 not found: ID does not exist" Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.806096 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"5cffe927e2d5a0446788b5fd7d127ce9dcbaf8f7bdc56ee8038410996d1c3424"} Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.808502 4785 generic.go:334] "Generic (PLEG): container finished" podID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerID="a8bbf6ec4a1fc51850a6e620bebf3647a4932c382abac79ec95b6e6e6b311abe" exitCode=0 Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.808708 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5pwc" event={"ID":"b5c5c94f-25c6-4d9e-80c2-57f09055aba9","Type":"ContainerDied","Data":"a8bbf6ec4a1fc51850a6e620bebf3647a4932c382abac79ec95b6e6e6b311abe"} Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.811341 4785 generic.go:334] "Generic (PLEG): container finished" podID="fb128907-8fff-4ad4-99f3-2877b676c46a" 
containerID="64dd987a699bf6fed70276198be334f367dfdd27e8f695446ddbc2c213000ec9" exitCode=0 Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.811397 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896ql" event={"ID":"fb128907-8fff-4ad4-99f3-2877b676c46a","Type":"ContainerDied","Data":"64dd987a699bf6fed70276198be334f367dfdd27e8f695446ddbc2c213000ec9"} Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.813811 4785 generic.go:334] "Generic (PLEG): container finished" podID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerID="92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a" exitCode=0 Nov 26 15:11:16 crc kubenswrapper[4785]: I1126 15:11:16.813841 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqxdc" event={"ID":"afb002bc-9134-4d4c-b956-15ad3e700c49","Type":"ContainerDied","Data":"92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a"} Nov 26 15:11:17 crc kubenswrapper[4785]: I1126 15:11:17.819824 4785 generic.go:334] "Generic (PLEG): container finished" podID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerID="97da9a0296a3f9e1b9c9f2254bdbf3c89dfc35e918a3261ba9e77f16bb43d338" exitCode=0 Nov 26 15:11:17 crc kubenswrapper[4785]: I1126 15:11:17.819891 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zdn6b" event={"ID":"441df9b7-56eb-451e-bdfa-beb5046aa892","Type":"ContainerDied","Data":"97da9a0296a3f9e1b9c9f2254bdbf3c89dfc35e918a3261ba9e77f16bb43d338"} Nov 26 15:11:18 crc kubenswrapper[4785]: I1126 15:11:18.827355 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5pwc" event={"ID":"b5c5c94f-25c6-4d9e-80c2-57f09055aba9","Type":"ContainerStarted","Data":"23f32a377d2c3eb2e3c287029f8c0b13d8359a3b8ff0f1edfcf61ab345f9bf62"} Nov 26 15:11:18 crc kubenswrapper[4785]: I1126 15:11:18.830278 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rqxdc" event={"ID":"afb002bc-9134-4d4c-b956-15ad3e700c49","Type":"ContainerStarted","Data":"64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d"} Nov 26 15:11:18 crc kubenswrapper[4785]: I1126 15:11:18.833889 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896ql" event={"ID":"fb128907-8fff-4ad4-99f3-2877b676c46a","Type":"ContainerStarted","Data":"3a2c284f93a9a0de54fde2be858a1812838542e83ca5ee5c73f3e59deb257493"} Nov 26 15:11:18 crc kubenswrapper[4785]: I1126 15:11:18.853972 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f5pwc" podStartSLOduration=2.860561333 podStartE2EDuration="1m29.853955384s" podCreationTimestamp="2025-11-26 15:09:49 +0000 UTC" firstStartedPulling="2025-11-26 15:09:51.163682691 +0000 UTC m=+154.842048455" lastFinishedPulling="2025-11-26 15:11:18.157076742 +0000 UTC m=+241.835442506" observedRunningTime="2025-11-26 15:11:18.852602974 +0000 UTC m=+242.530968758" watchObservedRunningTime="2025-11-26 15:11:18.853955384 +0000 UTC m=+242.532321148" Nov 26 15:11:18 crc kubenswrapper[4785]: I1126 15:11:18.870149 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rqxdc" podStartSLOduration=3.136828594 podStartE2EDuration="1m30.87013015s" podCreationTimestamp="2025-11-26 15:09:48 +0000 UTC" firstStartedPulling="2025-11-26 15:09:50.183918607 +0000 UTC m=+153.862284391" lastFinishedPulling="2025-11-26 15:11:17.917220163 +0000 UTC m=+241.595585947" observedRunningTime="2025-11-26 15:11:18.869651346 +0000 UTC m=+242.548017110" watchObservedRunningTime="2025-11-26 15:11:18.87013015 +0000 UTC m=+242.548495924" Nov 26 15:11:18 crc kubenswrapper[4785]: I1126 15:11:18.887254 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-896ql" 
podStartSLOduration=3.779755985 podStartE2EDuration="1m31.887238394s" podCreationTimestamp="2025-11-26 15:09:47 +0000 UTC" firstStartedPulling="2025-11-26 15:09:50.183604709 +0000 UTC m=+153.861970483" lastFinishedPulling="2025-11-26 15:11:18.291087118 +0000 UTC m=+241.969452892" observedRunningTime="2025-11-26 15:11:18.885348089 +0000 UTC m=+242.563713853" watchObservedRunningTime="2025-11-26 15:11:18.887238394 +0000 UTC m=+242.565604158" Nov 26 15:11:19 crc kubenswrapper[4785]: I1126 15:11:19.104887 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:11:19 crc kubenswrapper[4785]: I1126 15:11:19.104955 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:11:19 crc kubenswrapper[4785]: I1126 15:11:19.554781 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:11:19 crc kubenswrapper[4785]: I1126 15:11:19.554850 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:11:19 crc kubenswrapper[4785]: I1126 15:11:19.840799 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zdn6b" event={"ID":"441df9b7-56eb-451e-bdfa-beb5046aa892","Type":"ContainerStarted","Data":"9679946ed2d38c1768e95f9aca93aad3656c9f963941e36ae5dc0dad238941bb"} Nov 26 15:11:19 crc kubenswrapper[4785]: I1126 15:11:19.863029 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zdn6b" podStartSLOduration=2.788087648 podStartE2EDuration="1m33.863010541s" podCreationTimestamp="2025-11-26 15:09:46 +0000 UTC" firstStartedPulling="2025-11-26 15:09:48.04734507 +0000 UTC m=+151.725710834" lastFinishedPulling="2025-11-26 15:11:19.122267963 +0000 UTC m=+242.800633727" 
observedRunningTime="2025-11-26 15:11:19.861523458 +0000 UTC m=+243.539889242" watchObservedRunningTime="2025-11-26 15:11:19.863010541 +0000 UTC m=+243.541376305" Nov 26 15:11:20 crc kubenswrapper[4785]: I1126 15:11:20.155580 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqxdc" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="registry-server" probeResult="failure" output=< Nov 26 15:11:20 crc kubenswrapper[4785]: timeout: failed to connect service ":50051" within 1s Nov 26 15:11:20 crc kubenswrapper[4785]: > Nov 26 15:11:20 crc kubenswrapper[4785]: I1126 15:11:20.595075 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f5pwc" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="registry-server" probeResult="failure" output=< Nov 26 15:11:20 crc kubenswrapper[4785]: timeout: failed to connect service ":50051" within 1s Nov 26 15:11:20 crc kubenswrapper[4785]: > Nov 26 15:11:26 crc kubenswrapper[4785]: I1126 15:11:26.576193 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:11:26 crc kubenswrapper[4785]: I1126 15:11:26.576989 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:11:26 crc kubenswrapper[4785]: I1126 15:11:26.627856 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:11:26 crc kubenswrapper[4785]: I1126 15:11:26.952668 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:11:27 crc kubenswrapper[4785]: I1126 15:11:27.012642 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zdn6b"] Nov 26 15:11:28 crc kubenswrapper[4785]: I1126 15:11:28.303020 4785 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:11:28 crc kubenswrapper[4785]: I1126 15:11:28.303135 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:11:28 crc kubenswrapper[4785]: I1126 15:11:28.381330 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:11:28 crc kubenswrapper[4785]: I1126 15:11:28.890173 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zdn6b" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="registry-server" containerID="cri-o://9679946ed2d38c1768e95f9aca93aad3656c9f963941e36ae5dc0dad238941bb" gracePeriod=2 Nov 26 15:11:28 crc kubenswrapper[4785]: I1126 15:11:28.950049 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:11:29 crc kubenswrapper[4785]: I1126 15:11:29.153201 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:11:29 crc kubenswrapper[4785]: I1126 15:11:29.193837 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:11:29 crc kubenswrapper[4785]: I1126 15:11:29.616106 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:11:29 crc kubenswrapper[4785]: I1126 15:11:29.658857 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:11:30 crc kubenswrapper[4785]: I1126 15:11:30.061620 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-896ql"] Nov 26 15:11:30 
crc kubenswrapper[4785]: I1126 15:11:30.901420 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-896ql" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="registry-server" containerID="cri-o://3a2c284f93a9a0de54fde2be858a1812838542e83ca5ee5c73f3e59deb257493" gracePeriod=2 Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.458309 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5pwc"] Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.458625 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f5pwc" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="registry-server" containerID="cri-o://23f32a377d2c3eb2e3c287029f8c0b13d8359a3b8ff0f1edfcf61ab345f9bf62" gracePeriod=2 Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.913142 4785 generic.go:334] "Generic (PLEG): container finished" podID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerID="9679946ed2d38c1768e95f9aca93aad3656c9f963941e36ae5dc0dad238941bb" exitCode=0 Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.913207 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zdn6b" event={"ID":"441df9b7-56eb-451e-bdfa-beb5046aa892","Type":"ContainerDied","Data":"9679946ed2d38c1768e95f9aca93aad3656c9f963941e36ae5dc0dad238941bb"} Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.914831 4785 generic.go:334] "Generic (PLEG): container finished" podID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerID="3a2c284f93a9a0de54fde2be858a1812838542e83ca5ee5c73f3e59deb257493" exitCode=0 Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.914858 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896ql" 
event={"ID":"fb128907-8fff-4ad4-99f3-2877b676c46a","Type":"ContainerDied","Data":"3a2c284f93a9a0de54fde2be858a1812838542e83ca5ee5c73f3e59deb257493"} Nov 26 15:11:32 crc kubenswrapper[4785]: I1126 15:11:32.971718 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" containerName="oauth-openshift" containerID="cri-o://5d26e7e5415ccbf92293e4a6ac21162b55c99206553e2d3996f3fadbee0ded50" gracePeriod=15 Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.648103 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.783026 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-catalog-content\") pod \"441df9b7-56eb-451e-bdfa-beb5046aa892\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.783132 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-utilities\") pod \"441df9b7-56eb-451e-bdfa-beb5046aa892\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.783166 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mqr2\" (UniqueName: \"kubernetes.io/projected/441df9b7-56eb-451e-bdfa-beb5046aa892-kube-api-access-8mqr2\") pod \"441df9b7-56eb-451e-bdfa-beb5046aa892\" (UID: \"441df9b7-56eb-451e-bdfa-beb5046aa892\") " Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.784644 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-utilities" (OuterVolumeSpecName: "utilities") pod "441df9b7-56eb-451e-bdfa-beb5046aa892" (UID: "441df9b7-56eb-451e-bdfa-beb5046aa892"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.791394 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441df9b7-56eb-451e-bdfa-beb5046aa892-kube-api-access-8mqr2" (OuterVolumeSpecName: "kube-api-access-8mqr2") pod "441df9b7-56eb-451e-bdfa-beb5046aa892" (UID: "441df9b7-56eb-451e-bdfa-beb5046aa892"). InnerVolumeSpecName "kube-api-access-8mqr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.884279 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.884314 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mqr2\" (UniqueName: \"kubernetes.io/projected/441df9b7-56eb-451e-bdfa-beb5046aa892-kube-api-access-8mqr2\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.924318 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zdn6b" event={"ID":"441df9b7-56eb-451e-bdfa-beb5046aa892","Type":"ContainerDied","Data":"59eae0619bfd2a2a2298d7a106ab678b77733c64020351eca1ac096bde4b084b"} Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.924386 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zdn6b" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.924712 4785 scope.go:117] "RemoveContainer" containerID="9679946ed2d38c1768e95f9aca93aad3656c9f963941e36ae5dc0dad238941bb" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.947336 4785 scope.go:117] "RemoveContainer" containerID="97da9a0296a3f9e1b9c9f2254bdbf3c89dfc35e918a3261ba9e77f16bb43d338" Nov 26 15:11:33 crc kubenswrapper[4785]: I1126 15:11:33.961483 4785 scope.go:117] "RemoveContainer" containerID="1eb5d3ebaecb8563a2e3d569df9fa6dc49da0f1e9906351be49138fa82d7260a" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.129795 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.287838 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-utilities\") pod \"fb128907-8fff-4ad4-99f3-2877b676c46a\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.287908 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-catalog-content\") pod \"fb128907-8fff-4ad4-99f3-2877b676c46a\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.287952 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxml9\" (UniqueName: \"kubernetes.io/projected/fb128907-8fff-4ad4-99f3-2877b676c46a-kube-api-access-fxml9\") pod \"fb128907-8fff-4ad4-99f3-2877b676c46a\" (UID: \"fb128907-8fff-4ad4-99f3-2877b676c46a\") " Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.289132 4785 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-utilities" (OuterVolumeSpecName: "utilities") pod "fb128907-8fff-4ad4-99f3-2877b676c46a" (UID: "fb128907-8fff-4ad4-99f3-2877b676c46a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.291433 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb128907-8fff-4ad4-99f3-2877b676c46a-kube-api-access-fxml9" (OuterVolumeSpecName: "kube-api-access-fxml9") pod "fb128907-8fff-4ad4-99f3-2877b676c46a" (UID: "fb128907-8fff-4ad4-99f3-2877b676c46a"). InnerVolumeSpecName "kube-api-access-fxml9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.303936 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb128907-8fff-4ad4-99f3-2877b676c46a" (UID: "fb128907-8fff-4ad4-99f3-2877b676c46a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.390074 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.390118 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb128907-8fff-4ad4-99f3-2877b676c46a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.390136 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxml9\" (UniqueName: \"kubernetes.io/projected/fb128907-8fff-4ad4-99f3-2877b676c46a-kube-api-access-fxml9\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.582457 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "441df9b7-56eb-451e-bdfa-beb5046aa892" (UID: "441df9b7-56eb-451e-bdfa-beb5046aa892"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.593723 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441df9b7-56eb-451e-bdfa-beb5046aa892-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.856981 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zdn6b"] Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.862306 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zdn6b"] Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.932738 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-896ql" event={"ID":"fb128907-8fff-4ad4-99f3-2877b676c46a","Type":"ContainerDied","Data":"a652f1465c2b1af444b103091a41db928b422980128f1d5e71fd032c5597271a"} Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.933012 4785 scope.go:117] "RemoveContainer" containerID="3a2c284f93a9a0de54fde2be858a1812838542e83ca5ee5c73f3e59deb257493" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.932856 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-896ql" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.937523 4785 generic.go:334] "Generic (PLEG): container finished" podID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerID="23f32a377d2c3eb2e3c287029f8c0b13d8359a3b8ff0f1edfcf61ab345f9bf62" exitCode=0 Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.937637 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5pwc" event={"ID":"b5c5c94f-25c6-4d9e-80c2-57f09055aba9","Type":"ContainerDied","Data":"23f32a377d2c3eb2e3c287029f8c0b13d8359a3b8ff0f1edfcf61ab345f9bf62"} Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.939424 4785 generic.go:334] "Generic (PLEG): container finished" podID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" containerID="5d26e7e5415ccbf92293e4a6ac21162b55c99206553e2d3996f3fadbee0ded50" exitCode=0 Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.939510 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" event={"ID":"5d3c5506-f58a-45bb-adbe-e895b7e4d646","Type":"ContainerDied","Data":"5d26e7e5415ccbf92293e4a6ac21162b55c99206553e2d3996f3fadbee0ded50"} Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.959026 4785 scope.go:117] "RemoveContainer" containerID="64dd987a699bf6fed70276198be334f367dfdd27e8f695446ddbc2c213000ec9" Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.961643 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-896ql"] Nov 26 15:11:34 crc kubenswrapper[4785]: I1126 15:11:34.964482 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-896ql"] Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.001278 4785 scope.go:117] "RemoveContainer" containerID="20e658ba377105ee5e2b92633febaba966ca089d23bf144ed2f41f029217e02a" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 
15:11:35.042319 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" path="/var/lib/kubelet/pods/441df9b7-56eb-451e-bdfa-beb5046aa892/volumes" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.043377 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" path="/var/lib/kubelet/pods/fb128907-8fff-4ad4-99f3-2877b676c46a/volumes" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.394280 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.505662 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-idp-0-file-data\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.505771 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-error\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.505978 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506634 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-dir\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506705 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-ocp-branding-template\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506795 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-cliconfig\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506832 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-session\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506879 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-router-certs\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 
15:11:35.506907 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-serving-cert\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506958 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-provider-selection\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.506994 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-trusted-ca-bundle\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507034 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-service-ca\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507066 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcdzs\" (UniqueName: \"kubernetes.io/projected/5d3c5506-f58a-45bb-adbe-e895b7e4d646-kube-api-access-xcdzs\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507114 4785 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-policies\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507226 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-login\") pod \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\" (UID: \"5d3c5506-f58a-45bb-adbe-e895b7e4d646\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507495 4785 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507838 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.507936 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.508465 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.508743 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.511789 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.512465 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3c5506-f58a-45bb-adbe-e895b7e4d646-kube-api-access-xcdzs" (OuterVolumeSpecName: "kube-api-access-xcdzs") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "kube-api-access-xcdzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.512591 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.513228 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.513820 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.513848 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.513967 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.514131 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.514430 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5d3c5506-f58a-45bb-adbe-e895b7e4d646" (UID: "5d3c5506-f58a-45bb-adbe-e895b7e4d646"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.551213 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.608625 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.608960 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.608973 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.608983 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.608991 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609000 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcdzs\" (UniqueName: \"kubernetes.io/projected/5d3c5506-f58a-45bb-adbe-e895b7e4d646-kube-api-access-xcdzs\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609009 4785 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609019 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609048 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609058 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609067 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609075 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.609085 4785 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5d3c5506-f58a-45bb-adbe-e895b7e4d646-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc 
kubenswrapper[4785]: I1126 15:11:35.709544 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4z6j\" (UniqueName: \"kubernetes.io/projected/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-kube-api-access-z4z6j\") pod \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.709661 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-utilities\") pod \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.709690 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-catalog-content\") pod \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\" (UID: \"b5c5c94f-25c6-4d9e-80c2-57f09055aba9\") " Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.711196 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-utilities" (OuterVolumeSpecName: "utilities") pod "b5c5c94f-25c6-4d9e-80c2-57f09055aba9" (UID: "b5c5c94f-25c6-4d9e-80c2-57f09055aba9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.712964 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-kube-api-access-z4z6j" (OuterVolumeSpecName: "kube-api-access-z4z6j") pod "b5c5c94f-25c6-4d9e-80c2-57f09055aba9" (UID: "b5c5c94f-25c6-4d9e-80c2-57f09055aba9"). InnerVolumeSpecName "kube-api-access-z4z6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.810983 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.811027 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4z6j\" (UniqueName: \"kubernetes.io/projected/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-kube-api-access-z4z6j\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.873373 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5c5c94f-25c6-4d9e-80c2-57f09055aba9" (UID: "b5c5c94f-25c6-4d9e-80c2-57f09055aba9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.912710 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5c5c94f-25c6-4d9e-80c2-57f09055aba9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.945732 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" event={"ID":"5d3c5506-f58a-45bb-adbe-e895b7e4d646","Type":"ContainerDied","Data":"81c8a5c7e5f27c279372c67eb67bd1782d9657a8b6e2e8cb209424884773bbae"} Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.945788 4785 scope.go:117] "RemoveContainer" containerID="5d26e7e5415ccbf92293e4a6ac21162b55c99206553e2d3996f3fadbee0ded50" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.945801 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jmg2s" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.950478 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5pwc" event={"ID":"b5c5c94f-25c6-4d9e-80c2-57f09055aba9","Type":"ContainerDied","Data":"027647937cacd60d2a2863b9525d2cebe4a1c95d40d1f48c402dd0dc76408503"} Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.950586 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5pwc" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.971801 4785 scope.go:117] "RemoveContainer" containerID="23f32a377d2c3eb2e3c287029f8c0b13d8359a3b8ff0f1edfcf61ab345f9bf62" Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.988381 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jmg2s"] Nov 26 15:11:35 crc kubenswrapper[4785]: I1126 15:11:35.990167 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jmg2s"] Nov 26 15:11:36 crc kubenswrapper[4785]: I1126 15:11:36.008874 4785 scope.go:117] "RemoveContainer" containerID="a8bbf6ec4a1fc51850a6e620bebf3647a4932c382abac79ec95b6e6e6b311abe" Nov 26 15:11:36 crc kubenswrapper[4785]: I1126 15:11:36.010085 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5pwc"] Nov 26 15:11:36 crc kubenswrapper[4785]: I1126 15:11:36.013007 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f5pwc"] Nov 26 15:11:36 crc kubenswrapper[4785]: I1126 15:11:36.022653 4785 scope.go:117] "RemoveContainer" containerID="5051ff079b19344a07b9f0107c7aae8be173696989da76c3bcadfaa7ec02b5e5" Nov 26 15:11:37 crc kubenswrapper[4785]: I1126 15:11:37.042922 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" path="/var/lib/kubelet/pods/5d3c5506-f58a-45bb-adbe-e895b7e4d646/volumes" Nov 26 15:11:37 crc kubenswrapper[4785]: I1126 15:11:37.043960 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" path="/var/lib/kubelet/pods/b5c5c94f-25c6-4d9e-80c2-57f09055aba9/volumes" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288198 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw"] Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288408 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288422 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288433 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288438 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288449 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafbd7cf-352b-45ee-b6f3-a796ae7a49ed" containerName="pruner" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288455 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafbd7cf-352b-45ee-b6f3-a796ae7a49ed" containerName="pruner" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288465 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288481 
4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288491 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288498 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288509 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288517 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288525 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91accea7-45f2-45c7-b620-12023af1f863" containerName="pruner" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288532 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="91accea7-45f2-45c7-b620-12023af1f863" containerName="pruner" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288542 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288554 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288579 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288586 4785 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="extract-content" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288594 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288600 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288608 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" containerName="oauth-openshift" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288615 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" containerName="oauth-openshift" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288628 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36477d3-bcc2-47e0-8aab-687e7ae01f9e" containerName="collect-profiles" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288635 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36477d3-bcc2-47e0-8aab-687e7ae01f9e" containerName="collect-profiles" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288645 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288652 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288667 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288676 4785 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="extract-utilities" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288688 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288696 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: E1126 15:11:38.288707 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288714 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288815 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36477d3-bcc2-47e0-8aab-687e7ae01f9e" containerName="collect-profiles" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288830 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="441df9b7-56eb-451e-bdfa-beb5046aa892" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288842 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafbd7cf-352b-45ee-b6f3-a796ae7a49ed" containerName="pruner" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288851 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb128907-8fff-4ad4-99f3-2877b676c46a" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288860 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3c5506-f58a-45bb-adbe-e895b7e4d646" containerName="oauth-openshift" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288869 4785 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="91accea7-45f2-45c7-b620-12023af1f863" containerName="pruner" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288878 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c5c94f-25c6-4d9e-80c2-57f09055aba9" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.288885 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="24729921-e706-4a67-a93d-70bcc92b7bb8" containerName="registry-server" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.289264 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.292162 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.292660 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.292759 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.292885 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.293111 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.293208 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.293392 4785 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.293651 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.295046 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.295082 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.295158 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.295149 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.302840 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.304936 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.318335 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.319887 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw"] Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450758 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450813 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450849 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450923 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450944 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-session\") pod 
\"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450965 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.450989 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znxn9\" (UniqueName: \"kubernetes.io/projected/61135322-c104-4bb0-ba53-30edf2a19800-kube-api-access-znxn9\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.451045 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61135322-c104-4bb0-ba53-30edf2a19800-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.451109 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc 
kubenswrapper[4785]: I1126 15:11:38.451171 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.451194 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.451252 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.451278 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.451298 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.552920 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61135322-c104-4bb0-ba53-30edf2a19800-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553008 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553062 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553084 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: 
\"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553110 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553135 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553156 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553216 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553245 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553271 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553291 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553313 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553333 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: 
\"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553355 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znxn9\" (UniqueName: \"kubernetes.io/projected/61135322-c104-4bb0-ba53-30edf2a19800-kube-api-access-znxn9\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.553145 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61135322-c104-4bb0-ba53-30edf2a19800-audit-dir\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.554824 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.555277 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-audit-policies\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.555870 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.556469 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.559947 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-error\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.560253 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.560286 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-template-login\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " 
pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.560229 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.560466 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.560759 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-session\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.561094 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.561305 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/61135322-c104-4bb0-ba53-30edf2a19800-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.586682 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znxn9\" (UniqueName: \"kubernetes.io/projected/61135322-c104-4bb0-ba53-30edf2a19800-kube-api-access-znxn9\") pod \"oauth-openshift-6bbf4c9fdf-5bzvw\" (UID: \"61135322-c104-4bb0-ba53-30edf2a19800\") " pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.612368 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.804741 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw"] Nov 26 15:11:38 crc kubenswrapper[4785]: I1126 15:11:38.986970 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" event={"ID":"61135322-c104-4bb0-ba53-30edf2a19800","Type":"ContainerStarted","Data":"e52cbb15979ce8cb123fe9bf6122b007a6b3f20457ba073861179dc45d61783c"} Nov 26 15:11:39 crc kubenswrapper[4785]: I1126 15:11:39.994429 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" event={"ID":"61135322-c104-4bb0-ba53-30edf2a19800","Type":"ContainerStarted","Data":"e92f003340067673af013fdc7dcff38a0bd7c17ac709c0497f8922207410054f"} Nov 26 15:11:40 crc kubenswrapper[4785]: I1126 15:11:40.999469 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:41 crc kubenswrapper[4785]: I1126 15:11:41.005456 4785 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" Nov 26 15:11:41 crc kubenswrapper[4785]: I1126 15:11:41.021866 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6bbf4c9fdf-5bzvw" podStartSLOduration=34.021846305 podStartE2EDuration="34.021846305s" podCreationTimestamp="2025-11-26 15:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:11:41.017808059 +0000 UTC m=+264.696173833" watchObservedRunningTime="2025-11-26 15:11:41.021846305 +0000 UTC m=+264.700212079" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.091321 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zb7zt"] Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.091996 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zb7zt" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="registry-server" containerID="cri-o://f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330" gracePeriod=30 Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.104199 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q8fnj"] Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.104433 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q8fnj" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="registry-server" containerID="cri-o://56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10" gracePeriod=30 Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.113660 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cllbv"] Nov 
26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.113892 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerName="marketplace-operator" containerID="cri-o://fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b" gracePeriod=30 Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.122174 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-74t7j"] Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.122397 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-74t7j" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="registry-server" containerID="cri-o://04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2" gracePeriod=30 Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.131701 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqxdc"] Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.132014 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rqxdc" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="registry-server" containerID="cri-o://64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d" gracePeriod=30 Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.135682 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2fgr9"] Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.137000 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.147123 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2fgr9"] Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.319847 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.320243 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.320318 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p684\" (UniqueName: \"kubernetes.io/projected/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-kube-api-access-5p684\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.421840 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p684\" (UniqueName: \"kubernetes.io/projected/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-kube-api-access-5p684\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: 
\"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.421916 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.421955 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.423540 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.472501 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.514259 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5p684\" (UniqueName: \"kubernetes.io/projected/3dd50a40-cee2-4f3b-b522-cf1ab60c4be6-kube-api-access-5p684\") pod \"marketplace-operator-79b997595-2fgr9\" (UID: \"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6\") " pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.578121 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.604184 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.669654 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.698266 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.699937 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.709183 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.735123 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-utilities\") pod \"9558a7a8-65c7-4127-9174-aa1036efc91f\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.735244 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gctt6\" (UniqueName: \"kubernetes.io/projected/9558a7a8-65c7-4127-9174-aa1036efc91f-kube-api-access-gctt6\") pod \"9558a7a8-65c7-4127-9174-aa1036efc91f\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.735348 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-catalog-content\") pod \"9558a7a8-65c7-4127-9174-aa1036efc91f\" (UID: \"9558a7a8-65c7-4127-9174-aa1036efc91f\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.736403 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-utilities" (OuterVolumeSpecName: "utilities") pod "9558a7a8-65c7-4127-9174-aa1036efc91f" (UID: "9558a7a8-65c7-4127-9174-aa1036efc91f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.741716 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9558a7a8-65c7-4127-9174-aa1036efc91f-kube-api-access-gctt6" (OuterVolumeSpecName: "kube-api-access-gctt6") pod "9558a7a8-65c7-4127-9174-aa1036efc91f" (UID: "9558a7a8-65c7-4127-9174-aa1036efc91f"). InnerVolumeSpecName "kube-api-access-gctt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.786732 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9558a7a8-65c7-4127-9174-aa1036efc91f" (UID: "9558a7a8-65c7-4127-9174-aa1036efc91f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836279 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-catalog-content\") pod \"afb002bc-9134-4d4c-b956-15ad3e700c49\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836349 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7wkz\" (UniqueName: \"kubernetes.io/projected/afb002bc-9134-4d4c-b956-15ad3e700c49-kube-api-access-b7wkz\") pod \"afb002bc-9134-4d4c-b956-15ad3e700c49\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836410 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-utilities\") pod \"afb002bc-9134-4d4c-b956-15ad3e700c49\" (UID: \"afb002bc-9134-4d4c-b956-15ad3e700c49\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836456 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-utilities\") pod \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836490 4785 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-catalog-content\") pod \"5b0f6908-63a8-487c-ac41-e81114c43311\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836523 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsx94\" (UniqueName: \"kubernetes.io/projected/c5c2881b-79ef-4249-a393-dc3141e2e7c2-kube-api-access-dsx94\") pod \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836608 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-utilities\") pod \"5b0f6908-63a8-487c-ac41-e81114c43311\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836636 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-operator-metrics\") pod \"c9abc8ae-a944-4f87-909b-8258f95c2c06\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836662 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-catalog-content\") pod \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\" (UID: \"c5c2881b-79ef-4249-a393-dc3141e2e7c2\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836691 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqp65\" (UniqueName: 
\"kubernetes.io/projected/5b0f6908-63a8-487c-ac41-e81114c43311-kube-api-access-kqp65\") pod \"5b0f6908-63a8-487c-ac41-e81114c43311\" (UID: \"5b0f6908-63a8-487c-ac41-e81114c43311\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836730 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzzr5\" (UniqueName: \"kubernetes.io/projected/c9abc8ae-a944-4f87-909b-8258f95c2c06-kube-api-access-wzzr5\") pod \"c9abc8ae-a944-4f87-909b-8258f95c2c06\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836757 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-trusted-ca\") pod \"c9abc8ae-a944-4f87-909b-8258f95c2c06\" (UID: \"c9abc8ae-a944-4f87-909b-8258f95c2c06\") " Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.836993 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.837013 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gctt6\" (UniqueName: \"kubernetes.io/projected/9558a7a8-65c7-4127-9174-aa1036efc91f-kube-api-access-gctt6\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.837026 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9558a7a8-65c7-4127-9174-aa1036efc91f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.837323 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-utilities" (OuterVolumeSpecName: "utilities") pod 
"afb002bc-9134-4d4c-b956-15ad3e700c49" (UID: "afb002bc-9134-4d4c-b956-15ad3e700c49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.837625 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-utilities" (OuterVolumeSpecName: "utilities") pod "c5c2881b-79ef-4249-a393-dc3141e2e7c2" (UID: "c5c2881b-79ef-4249-a393-dc3141e2e7c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.837737 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-utilities" (OuterVolumeSpecName: "utilities") pod "5b0f6908-63a8-487c-ac41-e81114c43311" (UID: "5b0f6908-63a8-487c-ac41-e81114c43311"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.837779 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c9abc8ae-a944-4f87-909b-8258f95c2c06" (UID: "c9abc8ae-a944-4f87-909b-8258f95c2c06"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.839525 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb002bc-9134-4d4c-b956-15ad3e700c49-kube-api-access-b7wkz" (OuterVolumeSpecName: "kube-api-access-b7wkz") pod "afb002bc-9134-4d4c-b956-15ad3e700c49" (UID: "afb002bc-9134-4d4c-b956-15ad3e700c49"). InnerVolumeSpecName "kube-api-access-b7wkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.839611 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0f6908-63a8-487c-ac41-e81114c43311-kube-api-access-kqp65" (OuterVolumeSpecName: "kube-api-access-kqp65") pod "5b0f6908-63a8-487c-ac41-e81114c43311" (UID: "5b0f6908-63a8-487c-ac41-e81114c43311"). InnerVolumeSpecName "kube-api-access-kqp65". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.840041 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c2881b-79ef-4249-a393-dc3141e2e7c2-kube-api-access-dsx94" (OuterVolumeSpecName: "kube-api-access-dsx94") pod "c5c2881b-79ef-4249-a393-dc3141e2e7c2" (UID: "c5c2881b-79ef-4249-a393-dc3141e2e7c2"). InnerVolumeSpecName "kube-api-access-dsx94". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.840207 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9abc8ae-a944-4f87-909b-8258f95c2c06-kube-api-access-wzzr5" (OuterVolumeSpecName: "kube-api-access-wzzr5") pod "c9abc8ae-a944-4f87-909b-8258f95c2c06" (UID: "c9abc8ae-a944-4f87-909b-8258f95c2c06"). InnerVolumeSpecName "kube-api-access-wzzr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.844818 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c9abc8ae-a944-4f87-909b-8258f95c2c06" (UID: "c9abc8ae-a944-4f87-909b-8258f95c2c06"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.858219 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b0f6908-63a8-487c-ac41-e81114c43311" (UID: "5b0f6908-63a8-487c-ac41-e81114c43311"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.933090 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5c2881b-79ef-4249-a393-dc3141e2e7c2" (UID: "c5c2881b-79ef-4249-a393-dc3141e2e7c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938662 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938704 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938718 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938734 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsx94\" (UniqueName: \"kubernetes.io/projected/c5c2881b-79ef-4249-a393-dc3141e2e7c2-kube-api-access-dsx94\") on node \"crc\" 
DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938748 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0f6908-63a8-487c-ac41-e81114c43311-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938760 4785 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938773 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c2881b-79ef-4249-a393-dc3141e2e7c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938785 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqp65\" (UniqueName: \"kubernetes.io/projected/5b0f6908-63a8-487c-ac41-e81114c43311-kube-api-access-kqp65\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938796 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzzr5\" (UniqueName: \"kubernetes.io/projected/c9abc8ae-a944-4f87-909b-8258f95c2c06-kube-api-access-wzzr5\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938808 4785 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9abc8ae-a944-4f87-909b-8258f95c2c06-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc kubenswrapper[4785]: I1126 15:11:52.938820 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7wkz\" (UniqueName: \"kubernetes.io/projected/afb002bc-9134-4d4c-b956-15ad3e700c49-kube-api-access-b7wkz\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:52 crc 
kubenswrapper[4785]: I1126 15:11:52.945028 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afb002bc-9134-4d4c-b956-15ad3e700c49" (UID: "afb002bc-9134-4d4c-b956-15ad3e700c49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.044934 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb002bc-9134-4d4c-b956-15ad3e700c49-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.055450 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2fgr9"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.074913 4785 generic.go:334] "Generic (PLEG): container finished" podID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerID="56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10" exitCode=0 Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.074990 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerDied","Data":"56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.075041 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8fnj" event={"ID":"c5c2881b-79ef-4249-a393-dc3141e2e7c2","Type":"ContainerDied","Data":"0328e44a93acc37b968cfd29fbdaa8b8ae10b88d6b11611cbb4c7e18393df392"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.075133 4785 scope.go:117] "RemoveContainer" containerID="56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10" Nov 26 15:11:53 crc kubenswrapper[4785]: 
I1126 15:11:53.075702 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8fnj" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.076885 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" event={"ID":"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6","Type":"ContainerStarted","Data":"46ac2a1e8be5cb5824ee78857ca85e81521050317b2106764e23693466c06031"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.080072 4785 generic.go:334] "Generic (PLEG): container finished" podID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerID="64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d" exitCode=0 Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.080136 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqxdc" event={"ID":"afb002bc-9134-4d4c-b956-15ad3e700c49","Type":"ContainerDied","Data":"64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.080152 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rqxdc" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.080156 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqxdc" event={"ID":"afb002bc-9134-4d4c-b956-15ad3e700c49","Type":"ContainerDied","Data":"337b3925081fe5aa549c6af8a2761dbe6ab1da83c6a832f87daecafc8bd5d769"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.085407 4785 generic.go:334] "Generic (PLEG): container finished" podID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerID="f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330" exitCode=0 Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.085520 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zb7zt" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.085495 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerDied","Data":"f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.085632 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zb7zt" event={"ID":"9558a7a8-65c7-4127-9174-aa1036efc91f","Type":"ContainerDied","Data":"c57f52d524b132a6298c0b6d4f6cdf6751ca2b2ef6821e3cd9b20c9fabc15bfa"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.088460 4785 generic.go:334] "Generic (PLEG): container finished" podID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerID="fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b" exitCode=0 Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.088609 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" event={"ID":"c9abc8ae-a944-4f87-909b-8258f95c2c06","Type":"ContainerDied","Data":"fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.088724 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" event={"ID":"c9abc8ae-a944-4f87-909b-8258f95c2c06","Type":"ContainerDied","Data":"a36dcba9ed41cef720133ca4e4654b2cc3a3664cafeb1c6c82703ec22ef8cc11"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.088673 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cllbv" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.092047 4785 generic.go:334] "Generic (PLEG): container finished" podID="5b0f6908-63a8-487c-ac41-e81114c43311" containerID="04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2" exitCode=0 Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.092095 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74t7j" event={"ID":"5b0f6908-63a8-487c-ac41-e81114c43311","Type":"ContainerDied","Data":"04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.092267 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74t7j" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.092833 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74t7j" event={"ID":"5b0f6908-63a8-487c-ac41-e81114c43311","Type":"ContainerDied","Data":"138862daa24f51976894076a140afe04aa715795ad6946449ebf19c5d581ecfe"} Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.104035 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q8fnj"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.107086 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q8fnj"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.111487 4785 scope.go:117] "RemoveContainer" containerID="f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.122320 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqxdc"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.125997 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-rqxdc"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.136090 4785 scope.go:117] "RemoveContainer" containerID="91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.136154 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cllbv"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.140818 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cllbv"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.153952 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zb7zt"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.158072 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zb7zt"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.163262 4785 scope.go:117] "RemoveContainer" containerID="56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.163411 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-74t7j"] Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.164833 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10\": container with ID starting with 56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10 not found: ID does not exist" containerID="56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.164879 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10"} 
err="failed to get container status \"56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10\": rpc error: code = NotFound desc = could not find container \"56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10\": container with ID starting with 56b6a6926e215ca4d5b2d845ab4bf69abf200e14f7eca8eae5cc0ee5b261ae10 not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.164910 4785 scope.go:117] "RemoveContainer" containerID="f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.165270 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d\": container with ID starting with f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d not found: ID does not exist" containerID="f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.165310 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d"} err="failed to get container status \"f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d\": rpc error: code = NotFound desc = could not find container \"f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d\": container with ID starting with f7f636ad237724bcaca6c69358cb44c7e7996e7be8803fee8d3a6bbd582ca91d not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.165336 4785 scope.go:117] "RemoveContainer" containerID="91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.165752 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e\": container with ID starting with 91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e not found: ID does not exist" containerID="91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.165784 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e"} err="failed to get container status \"91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e\": rpc error: code = NotFound desc = could not find container \"91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e\": container with ID starting with 91dbbf16cf7a503365ec8b3c0ccc04696c430d484a8382fe5c62be9caa6c695e not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.165802 4785 scope.go:117] "RemoveContainer" containerID="64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.166771 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-74t7j"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.179658 4785 scope.go:117] "RemoveContainer" containerID="92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.198307 4785 scope.go:117] "RemoveContainer" containerID="31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.211380 4785 scope.go:117] "RemoveContainer" containerID="64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.211759 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d\": container with ID starting with 64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d not found: ID does not exist" containerID="64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.211791 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d"} err="failed to get container status \"64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d\": rpc error: code = NotFound desc = could not find container \"64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d\": container with ID starting with 64d079305ba96f01fc98b43154c648a5f29e8b91bcbeff719ae2f935c0cad09d not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.211817 4785 scope.go:117] "RemoveContainer" containerID="92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.212174 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a\": container with ID starting with 92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a not found: ID does not exist" containerID="92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.212196 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a"} err="failed to get container status \"92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a\": rpc error: code = NotFound desc = could not find container \"92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a\": container with ID 
starting with 92fb77da21fbfafb18f7fc7830fad7ddba592c90d22dce9690cde855b699420a not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.212209 4785 scope.go:117] "RemoveContainer" containerID="31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.212521 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67\": container with ID starting with 31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67 not found: ID does not exist" containerID="31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.212546 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67"} err="failed to get container status \"31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67\": rpc error: code = NotFound desc = could not find container \"31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67\": container with ID starting with 31bd3a334482ead8b5c1623efea7ce87998d06a89110c376bc47c73f3ece6d67 not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.212588 4785 scope.go:117] "RemoveContainer" containerID="f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.228008 4785 scope.go:117] "RemoveContainer" containerID="82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.246486 4785 scope.go:117] "RemoveContainer" containerID="87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.261355 4785 scope.go:117] "RemoveContainer" 
containerID="f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.261940 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330\": container with ID starting with f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330 not found: ID does not exist" containerID="f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.262004 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330"} err="failed to get container status \"f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330\": rpc error: code = NotFound desc = could not find container \"f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330\": container with ID starting with f22d0151e6b00adccb5774242c90983df5e615327d30f4456a1c2184318c5330 not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.262042 4785 scope.go:117] "RemoveContainer" containerID="82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.262627 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272\": container with ID starting with 82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272 not found: ID does not exist" containerID="82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.262658 4785 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272"} err="failed to get container status \"82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272\": rpc error: code = NotFound desc = could not find container \"82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272\": container with ID starting with 82ef80c8af4f9f0e3656991221146c39ae84d5198ddced371ed1514e247a5272 not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.262679 4785 scope.go:117] "RemoveContainer" containerID="87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.263230 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1\": container with ID starting with 87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1 not found: ID does not exist" containerID="87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.263277 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1"} err="failed to get container status \"87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1\": rpc error: code = NotFound desc = could not find container \"87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1\": container with ID starting with 87eac17aa6b913faf22fad6e705625c5fcb268424a7043799be8837a60aae5e1 not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.263300 4785 scope.go:117] "RemoveContainer" containerID="fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.278784 4785 scope.go:117] "RemoveContainer" 
containerID="fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.279649 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b\": container with ID starting with fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b not found: ID does not exist" containerID="fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.279682 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b"} err="failed to get container status \"fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b\": rpc error: code = NotFound desc = could not find container \"fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b\": container with ID starting with fde06ae70c4fd282c87c511071786aca9fc0b03847e2f611cd8c7ab5e03c1f1b not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.279705 4785 scope.go:117] "RemoveContainer" containerID="04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.331428 4785 scope.go:117] "RemoveContainer" containerID="acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.343864 4785 scope.go:117] "RemoveContainer" containerID="e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.373157 4785 scope.go:117] "RemoveContainer" containerID="04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.373722 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2\": container with ID starting with 04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2 not found: ID does not exist" containerID="04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.373773 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2"} err="failed to get container status \"04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2\": rpc error: code = NotFound desc = could not find container \"04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2\": container with ID starting with 04f2300fcaa2521dd445dcfaacf8087f7d3d20484fce34a830fb014b3f2318c2 not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.373805 4785 scope.go:117] "RemoveContainer" containerID="acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.374200 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade\": container with ID starting with acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade not found: ID does not exist" containerID="acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.374223 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade"} err="failed to get container status \"acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade\": rpc error: code = NotFound desc = could not find container 
\"acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade\": container with ID starting with acdc0012490ad120024a9b5b096deec77106827b47fe3b239b652118b2d1cade not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.374237 4785 scope.go:117] "RemoveContainer" containerID="e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.374810 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb\": container with ID starting with e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb not found: ID does not exist" containerID="e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.374889 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb"} err="failed to get container status \"e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb\": rpc error: code = NotFound desc = could not find container \"e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb\": container with ID starting with e8e59c3719f4769be1b82ffdd1169f29354272d4bc0a09bd6bedaacecd2577bb not found: ID does not exist" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.867981 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ffv6f"] Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868413 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868427 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="extract-content" 
Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868438 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868444 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868453 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerName="marketplace-operator" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868459 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerName="marketplace-operator" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868467 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868473 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868483 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868489 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868499 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="extract-utilities" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868504 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="extract-utilities" 
Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868510 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="extract-utilities" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868515 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="extract-utilities" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868524 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868530 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868541 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868546 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868573 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868582 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="extract-content" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868592 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="extract-utilities" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868598 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="extract-utilities" Nov 26 
15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868606 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868612 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: E1126 15:11:53.868621 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="extract-utilities" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868626 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="extract-utilities" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868767 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868783 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868813 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868824 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" containerName="marketplace-operator" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.868833 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" containerName="registry-server" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.869865 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.873974 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.888266 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffv6f"] Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.959309 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75367016-4697-457d-8bbe-c874cfa6e712-utilities\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.959378 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75367016-4697-457d-8bbe-c874cfa6e712-catalog-content\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:53 crc kubenswrapper[4785]: I1126 15:11:53.959398 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftlk6\" (UniqueName: \"kubernetes.io/projected/75367016-4697-457d-8bbe-c874cfa6e712-kube-api-access-ftlk6\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.060593 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75367016-4697-457d-8bbe-c874cfa6e712-utilities\") pod \"redhat-marketplace-ffv6f\" (UID: 
\"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.060777 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75367016-4697-457d-8bbe-c874cfa6e712-catalog-content\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.060830 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftlk6\" (UniqueName: \"kubernetes.io/projected/75367016-4697-457d-8bbe-c874cfa6e712-kube-api-access-ftlk6\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.061104 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75367016-4697-457d-8bbe-c874cfa6e712-utilities\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.061350 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75367016-4697-457d-8bbe-c874cfa6e712-catalog-content\") pod \"redhat-marketplace-ffv6f\" (UID: \"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.085571 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftlk6\" (UniqueName: \"kubernetes.io/projected/75367016-4697-457d-8bbe-c874cfa6e712-kube-api-access-ftlk6\") pod \"redhat-marketplace-ffv6f\" (UID: 
\"75367016-4697-457d-8bbe-c874cfa6e712\") " pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.102165 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" event={"ID":"3dd50a40-cee2-4f3b-b522-cf1ab60c4be6","Type":"ContainerStarted","Data":"a545ce4fd76bb98bc3d7dbed64b20c4c891ad2326048630807bf3aa57a82a81d"} Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.102345 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.108725 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.150572 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2fgr9" podStartSLOduration=2.150533834 podStartE2EDuration="2.150533834s" podCreationTimestamp="2025-11-26 15:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:11:54.123369611 +0000 UTC m=+277.801735385" watchObservedRunningTime="2025-11-26 15:11:54.150533834 +0000 UTC m=+277.828899598" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.189684 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffv6f" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.599189 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffv6f"] Nov 26 15:11:54 crc kubenswrapper[4785]: W1126 15:11:54.607371 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75367016_4697_457d_8bbe_c874cfa6e712.slice/crio-e0d8ced2bc69c055196be714ff0fcdc35fc6606af80cbd630be950b0f0af6ca5 WatchSource:0}: Error finding container e0d8ced2bc69c055196be714ff0fcdc35fc6606af80cbd630be950b0f0af6ca5: Status 404 returned error can't find the container with id e0d8ced2bc69c055196be714ff0fcdc35fc6606af80cbd630be950b0f0af6ca5 Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.864963 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tnkfn"] Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.865934 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tnkfn" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.871822 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.880833 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tnkfn"] Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.975020 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b86e813-9dab-4537-be3d-9903e0b53f70-utilities\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.975074 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92hgv\" (UniqueName: \"kubernetes.io/projected/7b86e813-9dab-4537-be3d-9903e0b53f70-kube-api-access-92hgv\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn" Nov 26 15:11:54 crc kubenswrapper[4785]: I1126 15:11:54.975116 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b86e813-9dab-4537-be3d-9903e0b53f70-catalog-content\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.055330 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0f6908-63a8-487c-ac41-e81114c43311" path="/var/lib/kubelet/pods/5b0f6908-63a8-487c-ac41-e81114c43311/volumes" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 
15:11:55.056291 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9558a7a8-65c7-4127-9174-aa1036efc91f" path="/var/lib/kubelet/pods/9558a7a8-65c7-4127-9174-aa1036efc91f/volumes" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.057116 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb002bc-9134-4d4c-b956-15ad3e700c49" path="/var/lib/kubelet/pods/afb002bc-9134-4d4c-b956-15ad3e700c49/volumes" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.058334 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c2881b-79ef-4249-a393-dc3141e2e7c2" path="/var/lib/kubelet/pods/c5c2881b-79ef-4249-a393-dc3141e2e7c2/volumes" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.059469 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9abc8ae-a944-4f87-909b-8258f95c2c06" path="/var/lib/kubelet/pods/c9abc8ae-a944-4f87-909b-8258f95c2c06/volumes" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.076532 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b86e813-9dab-4537-be3d-9903e0b53f70-utilities\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.076602 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92hgv\" (UniqueName: \"kubernetes.io/projected/7b86e813-9dab-4537-be3d-9903e0b53f70-kube-api-access-92hgv\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn" Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.076682 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7b86e813-9dab-4537-be3d-9903e0b53f70-catalog-content\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.077103 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b86e813-9dab-4537-be3d-9903e0b53f70-catalog-content\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.077841 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b86e813-9dab-4537-be3d-9903e0b53f70-utilities\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.108139 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92hgv\" (UniqueName: \"kubernetes.io/projected/7b86e813-9dab-4537-be3d-9903e0b53f70-kube-api-access-92hgv\") pod \"certified-operators-tnkfn\" (UID: \"7b86e813-9dab-4537-be3d-9903e0b53f70\") " pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.118470 4785 generic.go:334] "Generic (PLEG): container finished" podID="75367016-4697-457d-8bbe-c874cfa6e712" containerID="ad3e6c33bd4608923c4b5d7ef09453e3ed7505d5ca94ac8e9c98c737ff73aa8e" exitCode=0
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.118634 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffv6f" event={"ID":"75367016-4697-457d-8bbe-c874cfa6e712","Type":"ContainerDied","Data":"ad3e6c33bd4608923c4b5d7ef09453e3ed7505d5ca94ac8e9c98c737ff73aa8e"}
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.118689 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffv6f" event={"ID":"75367016-4697-457d-8bbe-c874cfa6e712","Type":"ContainerStarted","Data":"e0d8ced2bc69c055196be714ff0fcdc35fc6606af80cbd630be950b0f0af6ca5"}
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.186304 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:11:55 crc kubenswrapper[4785]: I1126 15:11:55.583234 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tnkfn"]
Nov 26 15:11:55 crc kubenswrapper[4785]: W1126 15:11:55.591803 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b86e813_9dab_4537_be3d_9903e0b53f70.slice/crio-d00d2c2bf4f131d57bc8f280bb7c1cac909e3c77ec41345907013b6c7db0cca5 WatchSource:0}: Error finding container d00d2c2bf4f131d57bc8f280bb7c1cac909e3c77ec41345907013b6c7db0cca5: Status 404 returned error can't find the container with id d00d2c2bf4f131d57bc8f280bb7c1cac909e3c77ec41345907013b6c7db0cca5
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.125113 4785 generic.go:334] "Generic (PLEG): container finished" podID="7b86e813-9dab-4537-be3d-9903e0b53f70" containerID="e05f49f2bb82e772cb2b52d9edf21139b104a55df88c33b613ea06da73a1fa5f" exitCode=0
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.125686 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnkfn" event={"ID":"7b86e813-9dab-4537-be3d-9903e0b53f70","Type":"ContainerDied","Data":"e05f49f2bb82e772cb2b52d9edf21139b104a55df88c33b613ea06da73a1fa5f"}
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.125710 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnkfn" event={"ID":"7b86e813-9dab-4537-be3d-9903e0b53f70","Type":"ContainerStarted","Data":"d00d2c2bf4f131d57bc8f280bb7c1cac909e3c77ec41345907013b6c7db0cca5"}
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.276009 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vmfhd"]
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.277063 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.279538 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.284061 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmfhd"]
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.293636 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkzws\" (UniqueName: \"kubernetes.io/projected/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-kube-api-access-rkzws\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.293724 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-catalog-content\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.293759 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-utilities\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.394875 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-catalog-content\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.394923 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-utilities\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.395117 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkzws\" (UniqueName: \"kubernetes.io/projected/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-kube-api-access-rkzws\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.395310 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-catalog-content\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.395435 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-utilities\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.424718 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkzws\" (UniqueName: \"kubernetes.io/projected/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-kube-api-access-rkzws\") pod \"community-operators-vmfhd\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:56 crc kubenswrapper[4785]: I1126 15:11:56.599754 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.023580 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vmfhd"]
Nov 26 15:11:57 crc kubenswrapper[4785]: W1126 15:11:57.032209 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a04c61d_ff4c_486b_bba4_1b133ac1b4bc.slice/crio-8d8c953e869d1359ac8a59f9ba266751d61a4ce435f334cfbffc0fd59c03e0f5 WatchSource:0}: Error finding container 8d8c953e869d1359ac8a59f9ba266751d61a4ce435f334cfbffc0fd59c03e0f5: Status 404 returned error can't find the container with id 8d8c953e869d1359ac8a59f9ba266751d61a4ce435f334cfbffc0fd59c03e0f5
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.130719 4785 generic.go:334] "Generic (PLEG): container finished" podID="75367016-4697-457d-8bbe-c874cfa6e712" containerID="c59af7a48f3053399297f9b2e5cf32b9dbcc549211a205b96f22eee86436e51b" exitCode=0
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.130776 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffv6f" event={"ID":"75367016-4697-457d-8bbe-c874cfa6e712","Type":"ContainerDied","Data":"c59af7a48f3053399297f9b2e5cf32b9dbcc549211a205b96f22eee86436e51b"}
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.132541 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerStarted","Data":"8d8c953e869d1359ac8a59f9ba266751d61a4ce435f334cfbffc0fd59c03e0f5"}
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.140094 4785 generic.go:334] "Generic (PLEG): container finished" podID="7b86e813-9dab-4537-be3d-9903e0b53f70" containerID="71b84ac16a0ccc103716017f43cc34fadd143e5d4f9d37acc917bdcfb827bdaf" exitCode=0
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.140131 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnkfn" event={"ID":"7b86e813-9dab-4537-be3d-9903e0b53f70","Type":"ContainerDied","Data":"71b84ac16a0ccc103716017f43cc34fadd143e5d4f9d37acc917bdcfb827bdaf"}
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.268585 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-td9bd"]
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.270745 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.274172 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.278460 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-td9bd"]
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.307497 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449ae537-a267-4c68-9aea-2712023ab42f-utilities\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.307629 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449ae537-a267-4c68-9aea-2712023ab42f-catalog-content\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.307801 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjkhm\" (UniqueName: \"kubernetes.io/projected/449ae537-a267-4c68-9aea-2712023ab42f-kube-api-access-mjkhm\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.409031 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjkhm\" (UniqueName: \"kubernetes.io/projected/449ae537-a267-4c68-9aea-2712023ab42f-kube-api-access-mjkhm\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.409099 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449ae537-a267-4c68-9aea-2712023ab42f-utilities\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.409125 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449ae537-a267-4c68-9aea-2712023ab42f-catalog-content\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.409511 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/449ae537-a267-4c68-9aea-2712023ab42f-catalog-content\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.409687 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/449ae537-a267-4c68-9aea-2712023ab42f-utilities\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.428669 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjkhm\" (UniqueName: \"kubernetes.io/projected/449ae537-a267-4c68-9aea-2712023ab42f-kube-api-access-mjkhm\") pod \"redhat-operators-td9bd\" (UID: \"449ae537-a267-4c68-9aea-2712023ab42f\") " pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.611699 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:11:57 crc kubenswrapper[4785]: I1126 15:11:57.821859 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-td9bd"]
Nov 26 15:11:57 crc kubenswrapper[4785]: W1126 15:11:57.831357 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449ae537_a267_4c68_9aea_2712023ab42f.slice/crio-ec681baf0da609391e7bc5710c68f073f6a546d6b8fcab494bab1d01d128042d WatchSource:0}: Error finding container ec681baf0da609391e7bc5710c68f073f6a546d6b8fcab494bab1d01d128042d: Status 404 returned error can't find the container with id ec681baf0da609391e7bc5710c68f073f6a546d6b8fcab494bab1d01d128042d
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.146177 4785 generic.go:334] "Generic (PLEG): container finished" podID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerID="5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27" exitCode=0
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.146249 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerDied","Data":"5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27"}
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.149054 4785 generic.go:334] "Generic (PLEG): container finished" podID="449ae537-a267-4c68-9aea-2712023ab42f" containerID="932e32a5e4546ac4003121d14b81cbea6887c6e59e5abe5df853568acf4bea9c" exitCode=0
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.149186 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td9bd" event={"ID":"449ae537-a267-4c68-9aea-2712023ab42f","Type":"ContainerDied","Data":"932e32a5e4546ac4003121d14b81cbea6887c6e59e5abe5df853568acf4bea9c"}
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.149267 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td9bd" event={"ID":"449ae537-a267-4c68-9aea-2712023ab42f","Type":"ContainerStarted","Data":"ec681baf0da609391e7bc5710c68f073f6a546d6b8fcab494bab1d01d128042d"}
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.153062 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tnkfn" event={"ID":"7b86e813-9dab-4537-be3d-9903e0b53f70","Type":"ContainerStarted","Data":"2793ab14dd899aabfa0d07f9f29450649d0db8f3b958f960fb141b8a652a0311"}
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.158871 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffv6f" event={"ID":"75367016-4697-457d-8bbe-c874cfa6e712","Type":"ContainerStarted","Data":"85f7ad227a31583fb81888895ef125706518064cc25d556962f0d87f0e081487"}
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.198451 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ffv6f" podStartSLOduration=2.764029406 podStartE2EDuration="5.198435269s" podCreationTimestamp="2025-11-26 15:11:53 +0000 UTC" firstStartedPulling="2025-11-26 15:11:55.120811462 +0000 UTC m=+278.799177266" lastFinishedPulling="2025-11-26 15:11:57.555217365 +0000 UTC m=+281.233583129" observedRunningTime="2025-11-26 15:11:58.196790461 +0000 UTC m=+281.875156225" watchObservedRunningTime="2025-11-26 15:11:58.198435269 +0000 UTC m=+281.876801033"
Nov 26 15:11:58 crc kubenswrapper[4785]: I1126 15:11:58.214320 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tnkfn" podStartSLOduration=2.576298386 podStartE2EDuration="4.214298176s" podCreationTimestamp="2025-11-26 15:11:54 +0000 UTC" firstStartedPulling="2025-11-26 15:11:56.126365228 +0000 UTC m=+279.804730992" lastFinishedPulling="2025-11-26 15:11:57.764365018 +0000 UTC m=+281.442730782" observedRunningTime="2025-11-26 15:11:58.211437744 +0000 UTC m=+281.889803518" watchObservedRunningTime="2025-11-26 15:11:58.214298176 +0000 UTC m=+281.892663940"
Nov 26 15:11:59 crc kubenswrapper[4785]: I1126 15:11:59.165838 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerStarted","Data":"9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804"}
Nov 26 15:11:59 crc kubenswrapper[4785]: I1126 15:11:59.167847 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td9bd" event={"ID":"449ae537-a267-4c68-9aea-2712023ab42f","Type":"ContainerStarted","Data":"594908b8ef8ba4ffa633854b81c6e63c3739e65d26acb8e066297b0a5ecb9f04"}
Nov 26 15:12:00 crc kubenswrapper[4785]: I1126 15:12:00.175952 4785 generic.go:334] "Generic (PLEG): container finished" podID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerID="9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804" exitCode=0
Nov 26 15:12:00 crc kubenswrapper[4785]: I1126 15:12:00.176000 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerDied","Data":"9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804"}
Nov 26 15:12:00 crc kubenswrapper[4785]: I1126 15:12:00.188381 4785 generic.go:334] "Generic (PLEG): container finished" podID="449ae537-a267-4c68-9aea-2712023ab42f" containerID="594908b8ef8ba4ffa633854b81c6e63c3739e65d26acb8e066297b0a5ecb9f04" exitCode=0
Nov 26 15:12:00 crc kubenswrapper[4785]: I1126 15:12:00.188434 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td9bd" event={"ID":"449ae537-a267-4c68-9aea-2712023ab42f","Type":"ContainerDied","Data":"594908b8ef8ba4ffa633854b81c6e63c3739e65d26acb8e066297b0a5ecb9f04"}
Nov 26 15:12:02 crc kubenswrapper[4785]: I1126 15:12:02.212182 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td9bd" event={"ID":"449ae537-a267-4c68-9aea-2712023ab42f","Type":"ContainerStarted","Data":"ac98224f6ce84f1b7a5f35ab5041dfbd9e9842d6094bdcba1969d97e0fa7840d"}
Nov 26 15:12:02 crc kubenswrapper[4785]: I1126 15:12:02.217153 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerStarted","Data":"de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7"}
Nov 26 15:12:02 crc kubenswrapper[4785]: I1126 15:12:02.232851 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-td9bd" podStartSLOduration=2.5579392739999998 podStartE2EDuration="5.232835524s" podCreationTimestamp="2025-11-26 15:11:57 +0000 UTC" firstStartedPulling="2025-11-26 15:11:58.150813555 +0000 UTC m=+281.829179319" lastFinishedPulling="2025-11-26 15:12:00.825709805 +0000 UTC m=+284.504075569" observedRunningTime="2025-11-26 15:12:02.23232579 +0000 UTC m=+285.910691574" watchObservedRunningTime="2025-11-26 15:12:02.232835524 +0000 UTC m=+285.911201288"
Nov 26 15:12:02 crc kubenswrapper[4785]: I1126 15:12:02.249362 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vmfhd" podStartSLOduration=3.632964718 podStartE2EDuration="6.249345281s" podCreationTimestamp="2025-11-26 15:11:56 +0000 UTC" firstStartedPulling="2025-11-26 15:11:58.147628363 +0000 UTC m=+281.825994127" lastFinishedPulling="2025-11-26 15:12:00.764008926 +0000 UTC m=+284.442374690" observedRunningTime="2025-11-26 15:12:02.247608971 +0000 UTC m=+285.925974755" watchObservedRunningTime="2025-11-26 15:12:02.249345281 +0000 UTC m=+285.927711045"
Nov 26 15:12:04 crc kubenswrapper[4785]: I1126 15:12:04.191064 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ffv6f"
Nov 26 15:12:04 crc kubenswrapper[4785]: I1126 15:12:04.191127 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ffv6f"
Nov 26 15:12:04 crc kubenswrapper[4785]: I1126 15:12:04.257756 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ffv6f"
Nov 26 15:12:04 crc kubenswrapper[4785]: I1126 15:12:04.318788 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ffv6f"
Nov 26 15:12:05 crc kubenswrapper[4785]: I1126 15:12:05.186939 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:12:05 crc kubenswrapper[4785]: I1126 15:12:05.186981 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:12:05 crc kubenswrapper[4785]: I1126 15:12:05.225902 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:12:05 crc kubenswrapper[4785]: I1126 15:12:05.275151 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tnkfn"
Nov 26 15:12:06 crc kubenswrapper[4785]: I1126 15:12:06.600016 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:12:06 crc kubenswrapper[4785]: I1126 15:12:06.600070 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:12:06 crc kubenswrapper[4785]: I1126 15:12:06.643240 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:12:07 crc kubenswrapper[4785]: I1126 15:12:07.290656 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vmfhd"
Nov 26 15:12:07 crc kubenswrapper[4785]: I1126 15:12:07.612466 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:12:07 crc kubenswrapper[4785]: I1126 15:12:07.612884 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:12:07 crc kubenswrapper[4785]: I1126 15:12:07.651639 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:12:08 crc kubenswrapper[4785]: I1126 15:12:08.285329 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-td9bd"
Nov 26 15:13:37 crc kubenswrapper[4785]: I1126 15:13:37.289738 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:13:37 crc kubenswrapper[4785]: I1126 15:13:37.290504 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:14:07 crc kubenswrapper[4785]: I1126 15:14:07.289051 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:14:07 crc kubenswrapper[4785]: I1126 15:14:07.290631 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.721616 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pmssh"]
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.722798 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.754589 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pmssh"]
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834410 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-bound-sa-token\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834477 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/034d2d06-5575-4466-9b44-e1a08901189b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834514 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-registry-tls\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834532 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/034d2d06-5575-4466-9b44-e1a08901189b-registry-certificates\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834567 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/034d2d06-5575-4466-9b44-e1a08901189b-trusted-ca\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834585 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjgqs\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-kube-api-access-rjgqs\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834610 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/034d2d06-5575-4466-9b44-e1a08901189b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.834633 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.853318 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936042 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/034d2d06-5575-4466-9b44-e1a08901189b-trusted-ca\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936126 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjgqs\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-kube-api-access-rjgqs\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936164 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/034d2d06-5575-4466-9b44-e1a08901189b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936277 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-bound-sa-token\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936336 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/034d2d06-5575-4466-9b44-e1a08901189b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936383 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-registry-tls\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.936418 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/034d2d06-5575-4466-9b44-e1a08901189b-registry-certificates\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.937524 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/034d2d06-5575-4466-9b44-e1a08901189b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.938138 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/034d2d06-5575-4466-9b44-e1a08901189b-trusted-ca\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.938484 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/034d2d06-5575-4466-9b44-e1a08901189b-registry-certificates\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.942292 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/034d2d06-5575-4466-9b44-e1a08901189b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.945009 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-registry-tls\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.951475 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjgqs\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-kube-api-access-rjgqs\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:29 crc kubenswrapper[4785]: I1126 15:14:29.955865 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/034d2d06-5575-4466-9b44-e1a08901189b-bound-sa-token\") pod \"image-registry-66df7c8f76-pmssh\" (UID: \"034d2d06-5575-4466-9b44-e1a08901189b\") " pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:30 crc kubenswrapper[4785]: I1126 15:14:30.040901 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:30 crc kubenswrapper[4785]: I1126 15:14:30.524303 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pmssh"]
Nov 26 15:14:30 crc kubenswrapper[4785]: I1126 15:14:30.668975 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh" event={"ID":"034d2d06-5575-4466-9b44-e1a08901189b","Type":"ContainerStarted","Data":"23231aef9fd70aa154da7775d561ac2e6d8998c9f69d159b8880c8befd23ff28"}
Nov 26 15:14:31 crc kubenswrapper[4785]: I1126 15:14:31.676547 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh" event={"ID":"034d2d06-5575-4466-9b44-e1a08901189b","Type":"ContainerStarted","Data":"60c0db310f416463d5b67fa97f00a76d0fedd084b5ed2fba2f1b26aaeacca775"}
Nov 26 15:14:31 crc kubenswrapper[4785]: I1126 15:14:31.676738 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh"
Nov 26 15:14:31 crc kubenswrapper[4785]: I1126 15:14:31.698199 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh" podStartSLOduration=2.698173201 podStartE2EDuration="2.698173201s" podCreationTimestamp="2025-11-26 15:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:14:31.693530167 +0000 UTC m=+435.371895951" watchObservedRunningTime="2025-11-26 15:14:31.698173201 +0000 UTC m=+435.376538985"
Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.288752 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.289733 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.289876 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.290533 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cffe927e2d5a0446788b5fd7d127ce9dcbaf8f7bdc56ee8038410996d1c3424"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.290846 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://5cffe927e2d5a0446788b5fd7d127ce9dcbaf8f7bdc56ee8038410996d1c3424" gracePeriod=600 Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.718017 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerID="5cffe927e2d5a0446788b5fd7d127ce9dcbaf8f7bdc56ee8038410996d1c3424" exitCode=0 Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.718119 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" 
event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"5cffe927e2d5a0446788b5fd7d127ce9dcbaf8f7bdc56ee8038410996d1c3424"} Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.718406 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"535a7eec54b083cab8f1c3d8c54d6dd26d51d22ccda14e30a0448b01a72f1e1c"} Nov 26 15:14:37 crc kubenswrapper[4785]: I1126 15:14:37.718437 4785 scope.go:117] "RemoveContainer" containerID="5538d579d2e0ac36c778f58da6e8176f8f7a0995797e8bc7c3ad21a054346867" Nov 26 15:14:50 crc kubenswrapper[4785]: I1126 15:14:50.052191 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-pmssh" Nov 26 15:14:50 crc kubenswrapper[4785]: I1126 15:14:50.111730 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrsd2"] Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.137985 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9"] Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.140109 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.143934 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.144259 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9"] Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.150906 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.269540 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971399-8e11-43ce-a968-007fd384f2d9-config-volume\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.269621 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f971399-8e11-43ce-a968-007fd384f2d9-secret-volume\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.269667 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbxh\" (UniqueName: \"kubernetes.io/projected/6f971399-8e11-43ce-a968-007fd384f2d9-kube-api-access-glbxh\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.371417 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glbxh\" (UniqueName: \"kubernetes.io/projected/6f971399-8e11-43ce-a968-007fd384f2d9-kube-api-access-glbxh\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.371621 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971399-8e11-43ce-a968-007fd384f2d9-config-volume\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.371683 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f971399-8e11-43ce-a968-007fd384f2d9-secret-volume\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.372857 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971399-8e11-43ce-a968-007fd384f2d9-config-volume\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.377960 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6f971399-8e11-43ce-a968-007fd384f2d9-secret-volume\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.394413 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbxh\" (UniqueName: \"kubernetes.io/projected/6f971399-8e11-43ce-a968-007fd384f2d9-kube-api-access-glbxh\") pod \"collect-profiles-29402835-9thb9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.466837 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.679697 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9"] Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.851796 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" event={"ID":"6f971399-8e11-43ce-a968-007fd384f2d9","Type":"ContainerStarted","Data":"bd0582bf2fab8f1db7ef95f7523623c7440461032966c493afe5aa56bdf55ec1"} Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.852007 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" event={"ID":"6f971399-8e11-43ce-a968-007fd384f2d9","Type":"ContainerStarted","Data":"2bafeb6dfae82d75768f011ff5108060d282bc20abefbdf1301f6588ea376cef"} Nov 26 15:15:00 crc kubenswrapper[4785]: I1126 15:15:00.871990 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" 
podStartSLOduration=0.871963392 podStartE2EDuration="871.963392ms" podCreationTimestamp="2025-11-26 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:15:00.865638294 +0000 UTC m=+464.544004058" watchObservedRunningTime="2025-11-26 15:15:00.871963392 +0000 UTC m=+464.550329166" Nov 26 15:15:01 crc kubenswrapper[4785]: I1126 15:15:01.862151 4785 generic.go:334] "Generic (PLEG): container finished" podID="6f971399-8e11-43ce-a968-007fd384f2d9" containerID="bd0582bf2fab8f1db7ef95f7523623c7440461032966c493afe5aa56bdf55ec1" exitCode=0 Nov 26 15:15:01 crc kubenswrapper[4785]: I1126 15:15:01.862359 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" event={"ID":"6f971399-8e11-43ce-a968-007fd384f2d9","Type":"ContainerDied","Data":"bd0582bf2fab8f1db7ef95f7523623c7440461032966c493afe5aa56bdf55ec1"} Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.062671 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.218202 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971399-8e11-43ce-a968-007fd384f2d9-config-volume\") pod \"6f971399-8e11-43ce-a968-007fd384f2d9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.218276 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glbxh\" (UniqueName: \"kubernetes.io/projected/6f971399-8e11-43ce-a968-007fd384f2d9-kube-api-access-glbxh\") pod \"6f971399-8e11-43ce-a968-007fd384f2d9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.218400 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f971399-8e11-43ce-a968-007fd384f2d9-secret-volume\") pod \"6f971399-8e11-43ce-a968-007fd384f2d9\" (UID: \"6f971399-8e11-43ce-a968-007fd384f2d9\") " Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.219014 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f971399-8e11-43ce-a968-007fd384f2d9-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f971399-8e11-43ce-a968-007fd384f2d9" (UID: "6f971399-8e11-43ce-a968-007fd384f2d9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.236107 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f971399-8e11-43ce-a968-007fd384f2d9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f971399-8e11-43ce-a968-007fd384f2d9" (UID: "6f971399-8e11-43ce-a968-007fd384f2d9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.236239 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f971399-8e11-43ce-a968-007fd384f2d9-kube-api-access-glbxh" (OuterVolumeSpecName: "kube-api-access-glbxh") pod "6f971399-8e11-43ce-a968-007fd384f2d9" (UID: "6f971399-8e11-43ce-a968-007fd384f2d9"). InnerVolumeSpecName "kube-api-access-glbxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.319927 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glbxh\" (UniqueName: \"kubernetes.io/projected/6f971399-8e11-43ce-a968-007fd384f2d9-kube-api-access-glbxh\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.319979 4785 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f971399-8e11-43ce-a968-007fd384f2d9-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.320001 4785 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971399-8e11-43ce-a968-007fd384f2d9-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.875257 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" event={"ID":"6f971399-8e11-43ce-a968-007fd384f2d9","Type":"ContainerDied","Data":"2bafeb6dfae82d75768f011ff5108060d282bc20abefbdf1301f6588ea376cef"} Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.875486 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bafeb6dfae82d75768f011ff5108060d282bc20abefbdf1301f6588ea376cef" Nov 26 15:15:03 crc kubenswrapper[4785]: I1126 15:15:03.875385 4785 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402835-9thb9" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.147644 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" podUID="80f8d801-cad5-4d41-b2c7-1bb2306b1b25" containerName="registry" containerID="cri-o://a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7" gracePeriod=30 Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.528882 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.686711 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-certificates\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.686831 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-tls\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.686898 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp6vh\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-kube-api-access-kp6vh\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.686978 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-ca-trust-extracted\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.687021 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-bound-sa-token\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.687087 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-trusted-ca\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.687131 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-installation-pull-secrets\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.687344 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\" (UID: \"80f8d801-cad5-4d41-b2c7-1bb2306b1b25\") " Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.687857 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.689056 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.693537 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.694118 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-kube-api-access-kp6vh" (OuterVolumeSpecName: "kube-api-access-kp6vh") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "kube-api-access-kp6vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.694654 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.694979 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.700965 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.705056 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "80f8d801-cad5-4d41-b2c7-1bb2306b1b25" (UID: "80f8d801-cad5-4d41-b2c7-1bb2306b1b25"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.788939 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp6vh\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-kube-api-access-kp6vh\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.788985 4785 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.788998 4785 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.789012 4785 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.789026 4785 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.789038 4785 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.789049 4785 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80f8d801-cad5-4d41-b2c7-1bb2306b1b25-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 26 15:15:15 crc 
kubenswrapper[4785]: I1126 15:15:15.955872 4785 generic.go:334] "Generic (PLEG): container finished" podID="80f8d801-cad5-4d41-b2c7-1bb2306b1b25" containerID="a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7" exitCode=0 Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.955960 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.955945 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" event={"ID":"80f8d801-cad5-4d41-b2c7-1bb2306b1b25","Type":"ContainerDied","Data":"a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7"} Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.956417 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrsd2" event={"ID":"80f8d801-cad5-4d41-b2c7-1bb2306b1b25","Type":"ContainerDied","Data":"7060ad56096ad31d3e5870ece508a21a402da0ff8f11416b4889b83f0913d76e"} Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.956532 4785 scope.go:117] "RemoveContainer" containerID="a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.975836 4785 scope.go:117] "RemoveContainer" containerID="a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7" Nov 26 15:15:15 crc kubenswrapper[4785]: E1126 15:15:15.976414 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7\": container with ID starting with a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7 not found: ID does not exist" containerID="a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7" Nov 26 15:15:15 crc kubenswrapper[4785]: I1126 15:15:15.976498 4785 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7"} err="failed to get container status \"a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7\": rpc error: code = NotFound desc = could not find container \"a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7\": container with ID starting with a0ad4ef719ebe6a761940a42e8a92bac899b06bfe2099b399c607dfdf4a1eff7 not found: ID does not exist" Nov 26 15:15:16 crc kubenswrapper[4785]: I1126 15:15:16.004461 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrsd2"] Nov 26 15:15:16 crc kubenswrapper[4785]: I1126 15:15:16.010341 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrsd2"] Nov 26 15:15:17 crc kubenswrapper[4785]: I1126 15:15:17.042834 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f8d801-cad5-4d41-b2c7-1bb2306b1b25" path="/var/lib/kubelet/pods/80f8d801-cad5-4d41-b2c7-1bb2306b1b25/volumes" Nov 26 15:16:37 crc kubenswrapper[4785]: I1126 15:16:37.288540 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:16:37 crc kubenswrapper[4785]: I1126 15:16:37.289363 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:17:07 crc kubenswrapper[4785]: I1126 15:17:07.288629 4785 patch_prober.go:28] interesting 
pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:17:07 crc kubenswrapper[4785]: I1126 15:17:07.289232 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.977536 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-925q9"] Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.980918 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-controller" containerID="cri-o://47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" gracePeriod=30 Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.980992 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="nbdb" containerID="cri-o://e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" gracePeriod=30 Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.981033 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="sbdb" containerID="cri-o://84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" gracePeriod=30 Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.981052 4785 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" gracePeriod=30 Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.981071 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="northd" containerID="cri-o://5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" gracePeriod=30 Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.981088 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-node" containerID="cri-o://7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" gracePeriod=30 Nov 26 15:17:36 crc kubenswrapper[4785]: I1126 15:17:36.981104 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-acl-logging" containerID="cri-o://77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" gracePeriod=30 Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.034359 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" containerID="cri-o://e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" gracePeriod=30 Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.288635 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.288679 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.288719 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.289331 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"535a7eec54b083cab8f1c3d8c54d6dd26d51d22ccda14e30a0448b01a72f1e1c"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.289380 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://535a7eec54b083cab8f1c3d8c54d6dd26d51d22ccda14e30a0448b01a72f1e1c" gracePeriod=600 Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.317411 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/3.log" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.319336 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovn-acl-logging/0.log" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.319887 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovn-controller/0.log" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.320284 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.371822 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kjgdh"] Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372168 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372191 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372209 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kubecfg-setup" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372250 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kubecfg-setup" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372265 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-node" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372277 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-node" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372290 4785 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="northd" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372298 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="northd" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372331 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="nbdb" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372339 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="nbdb" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372366 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f8d801-cad5-4d41-b2c7-1bb2306b1b25" containerName="registry" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372374 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f8d801-cad5-4d41-b2c7-1bb2306b1b25" containerName="registry" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372410 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="sbdb" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372418 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="sbdb" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372433 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372443 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372455 4785 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372488 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372497 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372504 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372514 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f971399-8e11-43ce-a968-007fd384f2d9" containerName="collect-profiles" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372522 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f971399-8e11-43ce-a968-007fd384f2d9" containerName="collect-profiles" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372531 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372539 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372576 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372584 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.372596 4785 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-acl-logging" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372604 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-acl-logging" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372767 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-ovn-metrics" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372787 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372797 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="northd" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372830 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372839 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f971399-8e11-43ce-a968-007fd384f2d9" containerName="collect-profiles" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372849 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372858 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f8d801-cad5-4d41-b2c7-1bb2306b1b25" containerName="registry" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372868 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="sbdb" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372879 4785 
memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372913 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="kube-rbac-proxy-node" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372924 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovn-acl-logging" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.372932 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="nbdb" Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.373088 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.373098 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.373263 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.373279 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerName="ovnkube-controller" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.375834 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452311 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-systemd-units\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452494 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-ovn\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452408 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452515 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-ovn-kubernetes\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452561 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452609 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-var-lib-openvswitch\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452655 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-script-lib\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452656 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452678 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-log-socket\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452687 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452701 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-systemd\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452745 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-netd\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452776 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovn-node-metrics-cert\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452821 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-bin\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452883 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-env-overrides\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452918 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-node-log\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452973 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-slash\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.452988 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-openvswitch\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453005 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxb5\" (UniqueName: \"kubernetes.io/projected/862c58fd-3f79-4276-bd76-ce689d32cbd6-kube-api-access-dgxb5\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453060 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-config\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453134 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-netns\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 
15:17:37.453152 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453166 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-kubelet\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453180 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-etc-openvswitch\") pod \"862c58fd-3f79-4276-bd76-ce689d32cbd6\" (UID: \"862c58fd-3f79-4276-bd76-ce689d32cbd6\") " Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453450 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453488 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453509 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-log-socket" (OuterVolumeSpecName: "log-socket") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453519 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453540 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453542 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453588 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453613 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453644 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-node-log" (OuterVolumeSpecName: "node-log") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453689 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453748 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-slash" (OuterVolumeSpecName: "host-slash") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453926 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.453978 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454168 4785 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454188 4785 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454203 4785 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454213 4785 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454222 4785 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454232 4785 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454242 4785 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 26 15:17:37 
crc kubenswrapper[4785]: I1126 15:17:37.454253 4785 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454263 4785 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454273 4785 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-log-socket\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454282 4785 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-netd\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454292 4785 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-cni-bin\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454301 4785 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454310 4785 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-node-log\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454319 4785 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-host-slash\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454329 4785 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.454338 4785 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.458945 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.461538 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862c58fd-3f79-4276-bd76-ce689d32cbd6-kube-api-access-dgxb5" (OuterVolumeSpecName: "kube-api-access-dgxb5") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "kube-api-access-dgxb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.471131 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "862c58fd-3f79-4276-bd76-ce689d32cbd6" (UID: "862c58fd-3f79-4276-bd76-ce689d32cbd6"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555388 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-systemd\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555440 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555481 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovnkube-script-lib\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555641 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-env-overrides\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555705 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555725 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-ovn\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555746 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-var-lib-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555776 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555864 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-log-socket\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555956 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-cni-bin\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.555997 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-kubelet\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556034 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-systemd-units\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556075 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-cni-netd\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556116 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovn-node-metrics-cert\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556268 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-slash\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556337 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4sll\" (UniqueName: \"kubernetes.io/projected/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-kube-api-access-h4sll\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556412 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-node-log\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556465 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-etc-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556486 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovnkube-config\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556508 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-run-netns\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556566 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxb5\" (UniqueName: \"kubernetes.io/projected/862c58fd-3f79-4276-bd76-ce689d32cbd6-kube-api-access-dgxb5\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556580 4785 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862c58fd-3f79-4276-bd76-ce689d32cbd6-run-systemd\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.556589 4785 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862c58fd-3f79-4276-bd76-ce689d32cbd6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658085 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-cni-bin\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658133 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-kubelet\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658159 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-systemd-units\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658186 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-cni-netd\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658210 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovn-node-metrics-cert\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658233 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-slash\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658239 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-cni-bin\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658254 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4sll\" (UniqueName: \"kubernetes.io/projected/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-kube-api-access-h4sll\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658287 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-kubelet\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658349 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-node-log\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658364 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-cni-netd\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658387 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-etc-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658385 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-systemd-units\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658418 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-node-log\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658450 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-etc-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658425 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovnkube-config\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658410 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-slash\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658491 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-run-netns\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658517 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-systemd\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658538 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658581 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-run-netns\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658615 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovnkube-script-lib\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658642 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-env-overrides\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658647 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658666 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-ovn\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658615 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-systemd\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658685 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658706 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-var-lib-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658725 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658739 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-log-socket\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658784 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-log-socket\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658809 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-run-ovn\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658828 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658849 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-var-lib-openvswitch\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.658869 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.659455 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovnkube-config\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.659605 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-env-overrides\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.659633 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovnkube-script-lib\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.664917 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-ovn-node-metrics-cert\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.679823 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4sll\" (UniqueName: \"kubernetes.io/projected/527d00ca-66d0-4c94-8b9e-cb65f3dcccc7-kube-api-access-h4sll\") pod \"ovnkube-node-kjgdh\" (UID: \"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.702150 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh"
Nov 26 15:17:37 crc kubenswrapper[4785]: W1126 15:17:37.727496 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod527d00ca_66d0_4c94_8b9e_cb65f3dcccc7.slice/crio-f54beace0df8cd7dbf6f556cfe8f3a010e431d515758c28dfdbdfb0e1c878f02 WatchSource:0}: Error finding container f54beace0df8cd7dbf6f556cfe8f3a010e431d515758c28dfdbdfb0e1c878f02: Status 404 returned error can't find the container with id f54beace0df8cd7dbf6f556cfe8f3a010e431d515758c28dfdbdfb0e1c878f02
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.814360 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/2.log"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.815537 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/1.log"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.815616 4785 generic.go:334] "Generic (PLEG): container finished" podID="855bd894-cca9-4fe1-a0d5-8b72afe7c93a" containerID="72a7e155fb846c9d6ff923ffa51d0d8a953f1aa548ab41ffc0d7339cd228d85b" exitCode=2
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.815727 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerDied","Data":"72a7e155fb846c9d6ff923ffa51d0d8a953f1aa548ab41ffc0d7339cd228d85b"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.815811 4785 scope.go:117] "RemoveContainer" containerID="96bab32cef462780b564492521f2bdddda092ed263374eb243d3e20ed8a068ff"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.816681 4785 scope.go:117] "RemoveContainer" containerID="72a7e155fb846c9d6ff923ffa51d0d8a953f1aa548ab41ffc0d7339cd228d85b"
Nov 26 15:17:37 crc kubenswrapper[4785]: E1126 15:17:37.817035 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6q4xd_openshift-multus(855bd894-cca9-4fe1-a0d5-8b72afe7c93a)\"" pod="openshift-multus/multus-6q4xd" podUID="855bd894-cca9-4fe1-a0d5-8b72afe7c93a"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.822598 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerID="535a7eec54b083cab8f1c3d8c54d6dd26d51d22ccda14e30a0448b01a72f1e1c" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.822693 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"535a7eec54b083cab8f1c3d8c54d6dd26d51d22ccda14e30a0448b01a72f1e1c"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.822742 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"5335547fe096a0c8db3ef049b0bdabe9e8e04a213e13c79d8696fbb738fcc9fa"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.838957 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"f54beace0df8cd7dbf6f556cfe8f3a010e431d515758c28dfdbdfb0e1c878f02"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.849911 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovnkube-controller/3.log"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.875325 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovn-acl-logging/0.log"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.876661 4785 scope.go:117] "RemoveContainer" containerID="5cffe927e2d5a0446788b5fd7d127ce9dcbaf8f7bdc56ee8038410996d1c3424"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.876902 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-925q9_862c58fd-3f79-4276-bd76-ce689d32cbd6/ovn-controller/0.log"
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878230 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878262 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878271 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878281 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878289 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878298 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" exitCode=0
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878307 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" exitCode=143
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878315 4785 generic.go:334] "Generic (PLEG): container finished" podID="862c58fd-3f79-4276-bd76-ce689d32cbd6" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" exitCode=143
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878336 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878371 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878385 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878401 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878419 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878431 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878446 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878459 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878468 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878475 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878483 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878495 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878502 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878509 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878516 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878523 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"}
Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878533 4785
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878545 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878570 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878576 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878581 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878586 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878592 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878597 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} Nov 26 
15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878602 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878610 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878616 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878624 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878632 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878639 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878646 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878651 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878656 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878661 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878666 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878671 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878676 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878681 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878688 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" event={"ID":"862c58fd-3f79-4276-bd76-ce689d32cbd6","Type":"ContainerDied","Data":"eb4ce55002c9c00ad0f8578d4ed01703cb91814179fe559da5b366bd016953b8"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878699 4785 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878704 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878709 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878714 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878720 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878725 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878731 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878736 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878741 4785 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878746 4785 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.878875 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-925q9" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.953827 4785 scope.go:117] "RemoveContainer" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.968106 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-925q9"] Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.971766 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.978606 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-925q9"] Nov 26 15:17:37 crc kubenswrapper[4785]: I1126 15:17:37.989453 4785 scope.go:117] "RemoveContainer" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.004241 4785 scope.go:117] "RemoveContainer" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.017795 4785 scope.go:117] "RemoveContainer" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.028656 4785 scope.go:117] "RemoveContainer" 
containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.038685 4785 scope.go:117] "RemoveContainer" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.048798 4785 scope.go:117] "RemoveContainer" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.060464 4785 scope.go:117] "RemoveContainer" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.077289 4785 scope.go:117] "RemoveContainer" containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.093732 4785 scope.go:117] "RemoveContainer" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.094127 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": container with ID starting with e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a not found: ID does not exist" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.094190 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} err="failed to get container status \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": rpc error: code = NotFound desc = could not find container \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": container with ID starting with 
e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.094231 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.094702 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": container with ID starting with c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da not found: ID does not exist" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.094739 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} err="failed to get container status \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": rpc error: code = NotFound desc = could not find container \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": container with ID starting with c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.094763 4785 scope.go:117] "RemoveContainer" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.094977 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": container with ID starting with 84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1 not found: ID does not exist" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" Nov 26 15:17:38 crc 
kubenswrapper[4785]: I1126 15:17:38.095016 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} err="failed to get container status \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": rpc error: code = NotFound desc = could not find container \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": container with ID starting with 84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.095039 4785 scope.go:117] "RemoveContainer" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.095288 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": container with ID starting with e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576 not found: ID does not exist" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.095322 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} err="failed to get container status \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": rpc error: code = NotFound desc = could not find container \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": container with ID starting with e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.095349 4785 scope.go:117] "RemoveContainer" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" Nov 26 
15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.095619 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": container with ID starting with 5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78 not found: ID does not exist" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.095655 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} err="failed to get container status \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": rpc error: code = NotFound desc = could not find container \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": container with ID starting with 5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.095678 4785 scope.go:117] "RemoveContainer" containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.096391 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": container with ID starting with 018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45 not found: ID does not exist" containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.096426 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} err="failed to get container status 
\"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": rpc error: code = NotFound desc = could not find container \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": container with ID starting with 018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.096449 4785 scope.go:117] "RemoveContainer" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.096699 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": container with ID starting with 7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17 not found: ID does not exist" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.096740 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} err="failed to get container status \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": rpc error: code = NotFound desc = could not find container \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": container with ID starting with 7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.096766 4785 scope.go:117] "RemoveContainer" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.097036 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": container with ID starting with 77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196 not found: ID does not exist" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.097069 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} err="failed to get container status \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": rpc error: code = NotFound desc = could not find container \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": container with ID starting with 77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.097092 4785 scope.go:117] "RemoveContainer" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.097321 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": container with ID starting with 47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935 not found: ID does not exist" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.097355 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} err="failed to get container status \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": rpc error: code = NotFound desc = could not find container \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": container with ID 
starting with 47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.097379 4785 scope.go:117] "RemoveContainer" containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" Nov 26 15:17:38 crc kubenswrapper[4785]: E1126 15:17:38.097642 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": container with ID starting with 049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1 not found: ID does not exist" containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.097665 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} err="failed to get container status \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": rpc error: code = NotFound desc = could not find container \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": container with ID starting with 049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.097680 4785 scope.go:117] "RemoveContainer" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098062 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} err="failed to get container status \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": rpc error: code = NotFound desc = could not find container \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": 
container with ID starting with e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098085 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098359 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} err="failed to get container status \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": rpc error: code = NotFound desc = could not find container \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": container with ID starting with c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098403 4785 scope.go:117] "RemoveContainer" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098641 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} err="failed to get container status \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": rpc error: code = NotFound desc = could not find container \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": container with ID starting with 84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098671 4785 scope.go:117] "RemoveContainer" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098935 4785 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} err="failed to get container status \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": rpc error: code = NotFound desc = could not find container \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": container with ID starting with e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.098968 4785 scope.go:117] "RemoveContainer" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.099203 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} err="failed to get container status \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": rpc error: code = NotFound desc = could not find container \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": container with ID starting with 5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.099228 4785 scope.go:117] "RemoveContainer" containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.099420 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} err="failed to get container status \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": rpc error: code = NotFound desc = could not find container \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": container with ID starting with 018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45 not found: ID does not 
exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.099443 4785 scope.go:117] "RemoveContainer" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.099730 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} err="failed to get container status \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": rpc error: code = NotFound desc = could not find container \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": container with ID starting with 7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.099746 4785 scope.go:117] "RemoveContainer" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.100031 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} err="failed to get container status \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": rpc error: code = NotFound desc = could not find container \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": container with ID starting with 77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.100052 4785 scope.go:117] "RemoveContainer" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.100370 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} err="failed to get container status 
\"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": rpc error: code = NotFound desc = could not find container \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": container with ID starting with 47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.100439 4785 scope.go:117] "RemoveContainer" containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.100811 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} err="failed to get container status \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": rpc error: code = NotFound desc = could not find container \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": container with ID starting with 049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.100837 4785 scope.go:117] "RemoveContainer" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.101100 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} err="failed to get container status \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": rpc error: code = NotFound desc = could not find container \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": container with ID starting with e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.101128 4785 scope.go:117] "RemoveContainer" 
containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.101415 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} err="failed to get container status \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": rpc error: code = NotFound desc = could not find container \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": container with ID starting with c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.101439 4785 scope.go:117] "RemoveContainer" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.101797 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} err="failed to get container status \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": rpc error: code = NotFound desc = could not find container \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": container with ID starting with 84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.101814 4785 scope.go:117] "RemoveContainer" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.102188 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} err="failed to get container status \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": rpc error: code = NotFound desc = could 
not find container \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": container with ID starting with e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.102214 4785 scope.go:117] "RemoveContainer" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.102478 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} err="failed to get container status \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": rpc error: code = NotFound desc = could not find container \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": container with ID starting with 5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.102501 4785 scope.go:117] "RemoveContainer" containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.102839 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} err="failed to get container status \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": rpc error: code = NotFound desc = could not find container \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": container with ID starting with 018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.102889 4785 scope.go:117] "RemoveContainer" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 
15:17:38.103082 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} err="failed to get container status \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": rpc error: code = NotFound desc = could not find container \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": container with ID starting with 7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.103123 4785 scope.go:117] "RemoveContainer" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.103318 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} err="failed to get container status \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": rpc error: code = NotFound desc = could not find container \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": container with ID starting with 77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.103335 4785 scope.go:117] "RemoveContainer" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.103586 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} err="failed to get container status \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": rpc error: code = NotFound desc = could not find container \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": container with ID starting with 
47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.103606 4785 scope.go:117] "RemoveContainer" containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104042 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} err="failed to get container status \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": rpc error: code = NotFound desc = could not find container \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": container with ID starting with 049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104060 4785 scope.go:117] "RemoveContainer" containerID="e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104415 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a"} err="failed to get container status \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": rpc error: code = NotFound desc = could not find container \"e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a\": container with ID starting with e3659c263897812b210f9a05d4de256ad029c456cea625a09d22e4a93c71ba6a not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104439 4785 scope.go:117] "RemoveContainer" containerID="c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104683 4785 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da"} err="failed to get container status \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": rpc error: code = NotFound desc = could not find container \"c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da\": container with ID starting with c8d76c8252001a8224d06473aabbbc9d9f0b4a73a415750b8414635ebf2b19da not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104708 4785 scope.go:117] "RemoveContainer" containerID="84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104951 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1"} err="failed to get container status \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": rpc error: code = NotFound desc = could not find container \"84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1\": container with ID starting with 84b98c7c476182b515072f71c6d81656debe5c03855324623b1eb31f53e6c5c1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.104996 4785 scope.go:117] "RemoveContainer" containerID="e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.105203 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576"} err="failed to get container status \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": rpc error: code = NotFound desc = could not find container \"e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576\": container with ID starting with e1a2013c12cbd23c8c9e4e0d5bd9f76df1c2c2f30f0d55fdb8c8cb03698ea576 not found: ID does not 
exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.105237 4785 scope.go:117] "RemoveContainer" containerID="5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.105469 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78"} err="failed to get container status \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": rpc error: code = NotFound desc = could not find container \"5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78\": container with ID starting with 5bc34631383b7470b2d582d037ea922f2c46d09ca9e37011d758e16837336a78 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.105497 4785 scope.go:117] "RemoveContainer" containerID="018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.105757 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45"} err="failed to get container status \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": rpc error: code = NotFound desc = could not find container \"018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45\": container with ID starting with 018b59a73e9d454f9156de17b00a8c5cccc3f321164cac374187afa18466bc45 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.105780 4785 scope.go:117] "RemoveContainer" containerID="7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106129 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17"} err="failed to get container status 
\"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": rpc error: code = NotFound desc = could not find container \"7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17\": container with ID starting with 7b754df1d5f2d20f3afdb10f2e77f4716de9d953c7322bb3e289d989d2699d17 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106149 4785 scope.go:117] "RemoveContainer" containerID="77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106358 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196"} err="failed to get container status \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": rpc error: code = NotFound desc = could not find container \"77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196\": container with ID starting with 77387f2d556a76807d809c75aa4d9e2fdf5de82d7a04e23d3f5461ccef860196 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106379 4785 scope.go:117] "RemoveContainer" containerID="47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106675 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935"} err="failed to get container status \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": rpc error: code = NotFound desc = could not find container \"47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935\": container with ID starting with 47be007cc11c8c2db5bc6dd5786cad4e45c1f8f08f872d060be0211012aad935 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106733 4785 scope.go:117] "RemoveContainer" 
containerID="049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.106956 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1"} err="failed to get container status \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": rpc error: code = NotFound desc = could not find container \"049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1\": container with ID starting with 049fc6de07d4b51e1c898035109c3427dff189cc509f5973db7cb75c3a022bc1 not found: ID does not exist" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.887461 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/2.log" Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.893775 4785 generic.go:334] "Generic (PLEG): container finished" podID="527d00ca-66d0-4c94-8b9e-cb65f3dcccc7" containerID="0ce23c1f271fd80a3e0d079344d555f1460bc489305bbf4fff942bc2fb94fb82" exitCode=0 Nov 26 15:17:38 crc kubenswrapper[4785]: I1126 15:17:38.893907 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerDied","Data":"0ce23c1f271fd80a3e0d079344d555f1460bc489305bbf4fff942bc2fb94fb82"} Nov 26 15:17:39 crc kubenswrapper[4785]: I1126 15:17:39.043912 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="862c58fd-3f79-4276-bd76-ce689d32cbd6" path="/var/lib/kubelet/pods/862c58fd-3f79-4276-bd76-ce689d32cbd6/volumes" Nov 26 15:17:39 crc kubenswrapper[4785]: I1126 15:17:39.905230 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" 
event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"628800948fae12fdce3b9f4b04c4a784e96131d7df8123f99b32aa7d5a057cd3"} Nov 26 15:17:39 crc kubenswrapper[4785]: I1126 15:17:39.905908 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"7cdbac816fa0e9759d11590c9c68533fe26d9d7f7a7c5ea2f32263cddec60b66"} Nov 26 15:17:39 crc kubenswrapper[4785]: I1126 15:17:39.905931 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"ceebdf5fb66990ab74365cf10e9151de2b4bfd303db0b1f3f111fecdfa603af2"} Nov 26 15:17:39 crc kubenswrapper[4785]: I1126 15:17:39.905957 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"23d345bf6033a58c5f21a4a3ab19d4b02adef412b10ebeac882ffcc18018be3c"} Nov 26 15:17:40 crc kubenswrapper[4785]: I1126 15:17:40.919100 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"2002d030cbfe16c84265c876e580a8aa0a9eb979d5e628f2ca464a9483f644fa"} Nov 26 15:17:40 crc kubenswrapper[4785]: I1126 15:17:40.919175 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"9402c0270a425d7da59984cb1f69de1ade455646df61fe6dbe82c302b28f8f91"} Nov 26 15:17:43 crc kubenswrapper[4785]: I1126 15:17:43.940015 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" 
event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"4d631c8f78e43458bb908e0c6f5135d2d353bc0276095950686f1da0d4a0ea6f"} Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.956463 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" event={"ID":"527d00ca-66d0-4c94-8b9e-cb65f3dcccc7","Type":"ContainerStarted","Data":"3e4620b6688dc6dbad3be82d5f96141a5f7013e4466dacad2e06f56be2c7c37e"} Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.957041 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.957055 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.957064 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.985663 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.985885 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" podStartSLOduration=8.985872227 podStartE2EDuration="8.985872227s" podCreationTimestamp="2025-11-26 15:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:17:45.982847988 +0000 UTC m=+629.661213772" watchObservedRunningTime="2025-11-26 15:17:45.985872227 +0000 UTC m=+629.664238011" Nov 26 15:17:45 crc kubenswrapper[4785]: I1126 15:17:45.985943 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:17:49 crc 
kubenswrapper[4785]: I1126 15:17:49.036596 4785 scope.go:117] "RemoveContainer" containerID="72a7e155fb846c9d6ff923ffa51d0d8a953f1aa548ab41ffc0d7339cd228d85b" Nov 26 15:17:49 crc kubenswrapper[4785]: E1126 15:17:49.037378 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6q4xd_openshift-multus(855bd894-cca9-4fe1-a0d5-8b72afe7c93a)\"" pod="openshift-multus/multus-6q4xd" podUID="855bd894-cca9-4fe1-a0d5-8b72afe7c93a" Nov 26 15:18:01 crc kubenswrapper[4785]: I1126 15:18:01.036419 4785 scope.go:117] "RemoveContainer" containerID="72a7e155fb846c9d6ff923ffa51d0d8a953f1aa548ab41ffc0d7339cd228d85b" Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.067617 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6q4xd_855bd894-cca9-4fe1-a0d5-8b72afe7c93a/kube-multus/2.log" Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.068025 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6q4xd" event={"ID":"855bd894-cca9-4fe1-a0d5-8b72afe7c93a","Type":"ContainerStarted","Data":"7d72587b782f9e5f800cd27ac221ed03ade164fca675449624a8d4e754cd824b"} Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.803892 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7"] Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.805002 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.808041 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.811040 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7"] Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.907091 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.907486 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjth7\" (UniqueName: \"kubernetes.io/projected/bcaf9d05-38af-46ec-b475-37ba51771361-kube-api-access-vjth7\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:02 crc kubenswrapper[4785]: I1126 15:18:02.907625 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: 
I1126 15:18:03.009785 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.009898 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjth7\" (UniqueName: \"kubernetes.io/projected/bcaf9d05-38af-46ec-b475-37ba51771361-kube-api-access-vjth7\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.009952 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.010691 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.011289 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.035072 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjth7\" (UniqueName: \"kubernetes.io/projected/bcaf9d05-38af-46ec-b475-37ba51771361-kube-api-access-vjth7\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.169523 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:03 crc kubenswrapper[4785]: I1126 15:18:03.429460 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7"] Nov 26 15:18:03 crc kubenswrapper[4785]: W1126 15:18:03.438038 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcaf9d05_38af_46ec_b475_37ba51771361.slice/crio-3e26dcc0d0f2f57ab6b9784d8bd7491709ddb1c6349c80d4f8abfb1c6f86f52a WatchSource:0}: Error finding container 3e26dcc0d0f2f57ab6b9784d8bd7491709ddb1c6349c80d4f8abfb1c6f86f52a: Status 404 returned error can't find the container with id 3e26dcc0d0f2f57ab6b9784d8bd7491709ddb1c6349c80d4f8abfb1c6f86f52a Nov 26 15:18:04 crc kubenswrapper[4785]: I1126 15:18:04.081516 4785 generic.go:334] "Generic (PLEG): container finished" podID="bcaf9d05-38af-46ec-b475-37ba51771361" containerID="f79a70b193e059a75cf059cd4eeb50bef06382c1b1b194f95be12405366d4c40" exitCode=0 
Nov 26 15:18:04 crc kubenswrapper[4785]: I1126 15:18:04.081602 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" event={"ID":"bcaf9d05-38af-46ec-b475-37ba51771361","Type":"ContainerDied","Data":"f79a70b193e059a75cf059cd4eeb50bef06382c1b1b194f95be12405366d4c40"} Nov 26 15:18:04 crc kubenswrapper[4785]: I1126 15:18:04.081884 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" event={"ID":"bcaf9d05-38af-46ec-b475-37ba51771361","Type":"ContainerStarted","Data":"3e26dcc0d0f2f57ab6b9784d8bd7491709ddb1c6349c80d4f8abfb1c6f86f52a"} Nov 26 15:18:04 crc kubenswrapper[4785]: I1126 15:18:04.083268 4785 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:18:07 crc kubenswrapper[4785]: I1126 15:18:07.101829 4785 generic.go:334] "Generic (PLEG): container finished" podID="bcaf9d05-38af-46ec-b475-37ba51771361" containerID="39b0aa93c312dae7e895a35067bdb352427c81a95bcd8a767ec4ba8d56a21336" exitCode=0 Nov 26 15:18:07 crc kubenswrapper[4785]: I1126 15:18:07.101924 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" event={"ID":"bcaf9d05-38af-46ec-b475-37ba51771361","Type":"ContainerDied","Data":"39b0aa93c312dae7e895a35067bdb352427c81a95bcd8a767ec4ba8d56a21336"} Nov 26 15:18:07 crc kubenswrapper[4785]: I1126 15:18:07.740118 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kjgdh" Nov 26 15:18:08 crc kubenswrapper[4785]: I1126 15:18:08.118150 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" 
event={"ID":"bcaf9d05-38af-46ec-b475-37ba51771361","Type":"ContainerStarted","Data":"54c70cf3bb1fd3bb45834dd07928ee1d288c8363b139f078f27aa70df7d2a777"} Nov 26 15:18:08 crc kubenswrapper[4785]: I1126 15:18:08.143525 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" podStartSLOduration=4.161099782 podStartE2EDuration="6.143508235s" podCreationTimestamp="2025-11-26 15:18:02 +0000 UTC" firstStartedPulling="2025-11-26 15:18:04.082946159 +0000 UTC m=+647.761311943" lastFinishedPulling="2025-11-26 15:18:06.065354622 +0000 UTC m=+649.743720396" observedRunningTime="2025-11-26 15:18:08.141291376 +0000 UTC m=+651.819657190" watchObservedRunningTime="2025-11-26 15:18:08.143508235 +0000 UTC m=+651.821874009" Nov 26 15:18:09 crc kubenswrapper[4785]: I1126 15:18:09.129435 4785 generic.go:334] "Generic (PLEG): container finished" podID="bcaf9d05-38af-46ec-b475-37ba51771361" containerID="54c70cf3bb1fd3bb45834dd07928ee1d288c8363b139f078f27aa70df7d2a777" exitCode=0 Nov 26 15:18:09 crc kubenswrapper[4785]: I1126 15:18:09.129506 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" event={"ID":"bcaf9d05-38af-46ec-b475-37ba51771361","Type":"ContainerDied","Data":"54c70cf3bb1fd3bb45834dd07928ee1d288c8363b139f078f27aa70df7d2a777"} Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.479298 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.632058 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-bundle\") pod \"bcaf9d05-38af-46ec-b475-37ba51771361\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.632147 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-util\") pod \"bcaf9d05-38af-46ec-b475-37ba51771361\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.632216 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjth7\" (UniqueName: \"kubernetes.io/projected/bcaf9d05-38af-46ec-b475-37ba51771361-kube-api-access-vjth7\") pod \"bcaf9d05-38af-46ec-b475-37ba51771361\" (UID: \"bcaf9d05-38af-46ec-b475-37ba51771361\") " Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.634408 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-bundle" (OuterVolumeSpecName: "bundle") pod "bcaf9d05-38af-46ec-b475-37ba51771361" (UID: "bcaf9d05-38af-46ec-b475-37ba51771361"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.640354 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcaf9d05-38af-46ec-b475-37ba51771361-kube-api-access-vjth7" (OuterVolumeSpecName: "kube-api-access-vjth7") pod "bcaf9d05-38af-46ec-b475-37ba51771361" (UID: "bcaf9d05-38af-46ec-b475-37ba51771361"). InnerVolumeSpecName "kube-api-access-vjth7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.654335 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-util" (OuterVolumeSpecName: "util") pod "bcaf9d05-38af-46ec-b475-37ba51771361" (UID: "bcaf9d05-38af-46ec-b475-37ba51771361"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.733601 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.733656 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcaf9d05-38af-46ec-b475-37ba51771361-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:18:10 crc kubenswrapper[4785]: I1126 15:18:10.733678 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjth7\" (UniqueName: \"kubernetes.io/projected/bcaf9d05-38af-46ec-b475-37ba51771361-kube-api-access-vjth7\") on node \"crc\" DevicePath \"\"" Nov 26 15:18:11 crc kubenswrapper[4785]: I1126 15:18:11.146927 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" event={"ID":"bcaf9d05-38af-46ec-b475-37ba51771361","Type":"ContainerDied","Data":"3e26dcc0d0f2f57ab6b9784d8bd7491709ddb1c6349c80d4f8abfb1c6f86f52a"} Nov 26 15:18:11 crc kubenswrapper[4785]: I1126 15:18:11.147390 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e26dcc0d0f2f57ab6b9784d8bd7491709ddb1c6349c80d4f8abfb1c6f86f52a" Nov 26 15:18:11 crc kubenswrapper[4785]: I1126 15:18:11.146995 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.890144 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl"] Nov 26 15:18:18 crc kubenswrapper[4785]: E1126 15:18:18.890878 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="util" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.890893 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="util" Nov 26 15:18:18 crc kubenswrapper[4785]: E1126 15:18:18.890902 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="extract" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.890907 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="extract" Nov 26 15:18:18 crc kubenswrapper[4785]: E1126 15:18:18.890922 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="pull" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.890929 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="pull" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.891012 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcaf9d05-38af-46ec-b475-37ba51771361" containerName="extract" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.891517 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.894975 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.895082 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mlgnr" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.895127 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.895333 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.895479 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 26 15:18:18 crc kubenswrapper[4785]: I1126 15:18:18.909989 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl"] Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.035905 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/161001b1-a5be-49ea-8031-e2c11dd07800-apiservice-cert\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.036170 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/161001b1-a5be-49ea-8031-e2c11dd07800-webhook-cert\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: 
\"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.036326 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj7kc\" (UniqueName: \"kubernetes.io/projected/161001b1-a5be-49ea-8031-e2c11dd07800-kube-api-access-zj7kc\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.137061 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/161001b1-a5be-49ea-8031-e2c11dd07800-apiservice-cert\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.137110 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/161001b1-a5be-49ea-8031-e2c11dd07800-webhook-cert\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.137167 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj7kc\" (UniqueName: \"kubernetes.io/projected/161001b1-a5be-49ea-8031-e2c11dd07800-kube-api-access-zj7kc\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.141937 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/161001b1-a5be-49ea-8031-e2c11dd07800-apiservice-cert\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.150197 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/161001b1-a5be-49ea-8031-e2c11dd07800-webhook-cert\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.150636 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj7kc\" (UniqueName: \"kubernetes.io/projected/161001b1-a5be-49ea-8031-e2c11dd07800-kube-api-access-zj7kc\") pod \"metallb-operator-controller-manager-84667dbb5-sslgl\" (UID: \"161001b1-a5be-49ea-8031-e2c11dd07800\") " pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.210909 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.422030 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf"] Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.422933 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.429082 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.429427 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nhb66" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.431127 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.449829 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf"] Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.541334 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q26qj\" (UniqueName: \"kubernetes.io/projected/3bc35760-4dcf-49b0-a6c7-19f57d889012-kube-api-access-q26qj\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.541452 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bc35760-4dcf-49b0-a6c7-19f57d889012-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.541477 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bc35760-4dcf-49b0-a6c7-19f57d889012-webhook-cert\") 
pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.598544 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl"] Nov 26 15:18:19 crc kubenswrapper[4785]: W1126 15:18:19.604133 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161001b1_a5be_49ea_8031_e2c11dd07800.slice/crio-f7f82423b45d2de326c412261f6e15d60e2265cb15e698c201e87ebee90c84a6 WatchSource:0}: Error finding container f7f82423b45d2de326c412261f6e15d60e2265cb15e698c201e87ebee90c84a6: Status 404 returned error can't find the container with id f7f82423b45d2de326c412261f6e15d60e2265cb15e698c201e87ebee90c84a6 Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.642889 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bc35760-4dcf-49b0-a6c7-19f57d889012-webhook-cert\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.642977 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q26qj\" (UniqueName: \"kubernetes.io/projected/3bc35760-4dcf-49b0-a6c7-19f57d889012-kube-api-access-q26qj\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.643018 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3bc35760-4dcf-49b0-a6c7-19f57d889012-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.657397 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bc35760-4dcf-49b0-a6c7-19f57d889012-webhook-cert\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.657405 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bc35760-4dcf-49b0-a6c7-19f57d889012-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.661234 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q26qj\" (UniqueName: \"kubernetes.io/projected/3bc35760-4dcf-49b0-a6c7-19f57d889012-kube-api-access-q26qj\") pod \"metallb-operator-webhook-server-6cf4498f-spzbf\" (UID: \"3bc35760-4dcf-49b0-a6c7-19f57d889012\") " pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.740174 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:19 crc kubenswrapper[4785]: I1126 15:18:19.927113 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf"] Nov 26 15:18:20 crc kubenswrapper[4785]: I1126 15:18:20.198426 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" event={"ID":"3bc35760-4dcf-49b0-a6c7-19f57d889012","Type":"ContainerStarted","Data":"da15606b4546d413cf5c44c12fbc79f74a298c0bd1b40b3129c25040876aeb3b"} Nov 26 15:18:20 crc kubenswrapper[4785]: I1126 15:18:20.199434 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" event={"ID":"161001b1-a5be-49ea-8031-e2c11dd07800","Type":"ContainerStarted","Data":"f7f82423b45d2de326c412261f6e15d60e2265cb15e698c201e87ebee90c84a6"} Nov 26 15:18:23 crc kubenswrapper[4785]: I1126 15:18:23.216301 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" event={"ID":"161001b1-a5be-49ea-8031-e2c11dd07800","Type":"ContainerStarted","Data":"49e10019708c7c760007229dd4fdf7afe9b43556b42661516134fda3234f359a"} Nov 26 15:18:23 crc kubenswrapper[4785]: I1126 15:18:23.216909 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:18:23 crc kubenswrapper[4785]: I1126 15:18:23.234750 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" podStartSLOduration=2.074440939 podStartE2EDuration="5.234712556s" podCreationTimestamp="2025-11-26 15:18:18 +0000 UTC" firstStartedPulling="2025-11-26 15:18:19.605918336 +0000 UTC m=+663.284284090" lastFinishedPulling="2025-11-26 15:18:22.766189943 +0000 UTC m=+666.444555707" 
observedRunningTime="2025-11-26 15:18:23.232631371 +0000 UTC m=+666.910997155" watchObservedRunningTime="2025-11-26 15:18:23.234712556 +0000 UTC m=+666.913078330" Nov 26 15:18:25 crc kubenswrapper[4785]: I1126 15:18:25.231648 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" event={"ID":"3bc35760-4dcf-49b0-a6c7-19f57d889012","Type":"ContainerStarted","Data":"1ca9a37bba6aea97aacae28bccbec3c2111bbd855af5666e154a43415dd8bc4f"} Nov 26 15:18:25 crc kubenswrapper[4785]: I1126 15:18:25.233131 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:25 crc kubenswrapper[4785]: I1126 15:18:25.273885 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" podStartSLOduration=1.806337764 podStartE2EDuration="6.273864412s" podCreationTimestamp="2025-11-26 15:18:19 +0000 UTC" firstStartedPulling="2025-11-26 15:18:19.938316096 +0000 UTC m=+663.616681860" lastFinishedPulling="2025-11-26 15:18:24.405842744 +0000 UTC m=+668.084208508" observedRunningTime="2025-11-26 15:18:25.270164114 +0000 UTC m=+668.948529918" watchObservedRunningTime="2025-11-26 15:18:25.273864412 +0000 UTC m=+668.952230186" Nov 26 15:18:39 crc kubenswrapper[4785]: I1126 15:18:39.749034 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6cf4498f-spzbf" Nov 26 15:18:59 crc kubenswrapper[4785]: I1126 15:18:59.213669 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.024376 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h5jcn"] Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.027465 4785 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.030217 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.031124 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.031277 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-pnk82" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.049316 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh"] Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.051755 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.054836 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.077328 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh"] Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.113506 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mdltn"] Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.114495 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.116545 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.116957 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.117128 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kz6tr" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.119195 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.141076 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-9gfl9"] Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.142210 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.145442 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.148437 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-9gfl9"] Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189038 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-frr-sockets\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189107 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6rg\" (UniqueName: \"kubernetes.io/projected/c54bc198-6911-461a-9c78-dfd7fd744524-kube-api-access-lk6rg\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189439 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-reloader\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189594 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-metallb-excludel2\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 
15:19:00.189649 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-metrics\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189670 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-metrics-certs\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189746 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189835 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c54bc198-6911-461a-9c78-dfd7fd744524-frr-startup\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189871 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8hs\" (UniqueName: \"kubernetes.io/projected/98a95f21-9e38-4113-848b-6b9ced267e38-kube-api-access-lw8hs\") pod \"frr-k8s-webhook-server-6998585d5-7bmlh\" (UID: \"98a95f21-9e38-4113-848b-6b9ced267e38\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189912 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54bc198-6911-461a-9c78-dfd7fd744524-metrics-certs\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189937 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98a95f21-9e38-4113-848b-6b9ced267e38-cert\") pod \"frr-k8s-webhook-server-6998585d5-7bmlh\" (UID: \"98a95f21-9e38-4113-848b-6b9ced267e38\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.189974 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-frr-conf\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.190083 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjtxn\" (UniqueName: \"kubernetes.io/projected/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-kube-api-access-sjtxn\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291607 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-metallb-excludel2\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291656 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" 
(UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-metrics\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291674 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-metrics-certs\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291688 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291713 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhmr\" (UniqueName: \"kubernetes.io/projected/6aa1ab43-d7e6-412b-a16d-d92e016442bb-kube-api-access-zzhmr\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291732 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c54bc198-6911-461a-9c78-dfd7fd744524-frr-startup\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291749 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8hs\" (UniqueName: \"kubernetes.io/projected/98a95f21-9e38-4113-848b-6b9ced267e38-kube-api-access-lw8hs\") pod 
\"frr-k8s-webhook-server-6998585d5-7bmlh\" (UID: \"98a95f21-9e38-4113-848b-6b9ced267e38\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291771 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54bc198-6911-461a-9c78-dfd7fd744524-metrics-certs\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291789 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98a95f21-9e38-4113-848b-6b9ced267e38-cert\") pod \"frr-k8s-webhook-server-6998585d5-7bmlh\" (UID: \"98a95f21-9e38-4113-848b-6b9ced267e38\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291807 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-frr-conf\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291831 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjtxn\" (UniqueName: \"kubernetes.io/projected/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-kube-api-access-sjtxn\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291852 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-metrics-certs\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " 
pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: E1126 15:19:00.291875 4785 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291908 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-cert\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.291934 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-frr-sockets\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: E1126 15:19:00.292015 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist podName:2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d nodeName:}" failed. No retries permitted until 2025-11-26 15:19:00.791940392 +0000 UTC m=+704.470306266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist") pod "speaker-mdltn" (UID: "2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d") : secret "metallb-memberlist" not found Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292276 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-metrics\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292468 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-frr-sockets\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292525 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6rg\" (UniqueName: \"kubernetes.io/projected/c54bc198-6911-461a-9c78-dfd7fd744524-kube-api-access-lk6rg\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292590 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-frr-conf\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292674 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-reloader\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " 
pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292741 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-metallb-excludel2\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.292863 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c54bc198-6911-461a-9c78-dfd7fd744524-reloader\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.293641 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c54bc198-6911-461a-9c78-dfd7fd744524-frr-startup\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.300915 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98a95f21-9e38-4113-848b-6b9ced267e38-cert\") pod \"frr-k8s-webhook-server-6998585d5-7bmlh\" (UID: \"98a95f21-9e38-4113-848b-6b9ced267e38\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.307806 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-metrics-certs\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.309472 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c54bc198-6911-461a-9c78-dfd7fd744524-metrics-certs\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.309750 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8hs\" (UniqueName: \"kubernetes.io/projected/98a95f21-9e38-4113-848b-6b9ced267e38-kube-api-access-lw8hs\") pod \"frr-k8s-webhook-server-6998585d5-7bmlh\" (UID: \"98a95f21-9e38-4113-848b-6b9ced267e38\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.310947 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjtxn\" (UniqueName: \"kubernetes.io/projected/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-kube-api-access-sjtxn\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.321188 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6rg\" (UniqueName: \"kubernetes.io/projected/c54bc198-6911-461a-9c78-dfd7fd744524-kube-api-access-lk6rg\") pod \"frr-k8s-h5jcn\" (UID: \"c54bc198-6911-461a-9c78-dfd7fd744524\") " pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.355838 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.374095 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.394049 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhmr\" (UniqueName: \"kubernetes.io/projected/6aa1ab43-d7e6-412b-a16d-d92e016442bb-kube-api-access-zzhmr\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.394130 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-metrics-certs\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.394165 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-cert\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: E1126 15:19:00.394414 4785 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 26 15:19:00 crc kubenswrapper[4785]: E1126 15:19:00.394577 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-metrics-certs podName:6aa1ab43-d7e6-412b-a16d-d92e016442bb nodeName:}" failed. No retries permitted until 2025-11-26 15:19:00.894524771 +0000 UTC m=+704.572890555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-metrics-certs") pod "controller-6c7b4b5f48-9gfl9" (UID: "6aa1ab43-d7e6-412b-a16d-d92e016442bb") : secret "controller-certs-secret" not found Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.398799 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-cert\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.421248 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhmr\" (UniqueName: \"kubernetes.io/projected/6aa1ab43-d7e6-412b-a16d-d92e016442bb-kube-api-access-zzhmr\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.771854 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh"] Nov 26 15:19:00 crc kubenswrapper[4785]: W1126 15:19:00.775954 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98a95f21_9e38_4113_848b_6b9ced267e38.slice/crio-079fc9087ac1dbbc55136ad3902dab2057d10198d02e73b861ecd8a1c43388fe WatchSource:0}: Error finding container 079fc9087ac1dbbc55136ad3902dab2057d10198d02e73b861ecd8a1c43388fe: Status 404 returned error can't find the container with id 079fc9087ac1dbbc55136ad3902dab2057d10198d02e73b861ecd8a1c43388fe Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.799583 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist\") 
pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:00 crc kubenswrapper[4785]: E1126 15:19:00.799780 4785 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 26 15:19:00 crc kubenswrapper[4785]: E1126 15:19:00.799876 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist podName:2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d nodeName:}" failed. No retries permitted until 2025-11-26 15:19:01.799853651 +0000 UTC m=+705.478219415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist") pod "speaker-mdltn" (UID: "2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d") : secret "metallb-memberlist" not found Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.901212 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-metrics-certs\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:00 crc kubenswrapper[4785]: I1126 15:19:00.905794 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6aa1ab43-d7e6-412b-a16d-d92e016442bb-metrics-certs\") pod \"controller-6c7b4b5f48-9gfl9\" (UID: \"6aa1ab43-d7e6-412b-a16d-d92e016442bb\") " pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.101460 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.346814 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-9gfl9"] Nov 26 15:19:01 crc kubenswrapper[4785]: W1126 15:19:01.353056 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa1ab43_d7e6_412b_a16d_d92e016442bb.slice/crio-15891653c1198d1e19152bbc40408e474aa40a62bd9e88d6d47a12766ad7cb27 WatchSource:0}: Error finding container 15891653c1198d1e19152bbc40408e474aa40a62bd9e88d6d47a12766ad7cb27: Status 404 returned error can't find the container with id 15891653c1198d1e19152bbc40408e474aa40a62bd9e88d6d47a12766ad7cb27 Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.447143 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-9gfl9" event={"ID":"6aa1ab43-d7e6-412b-a16d-d92e016442bb","Type":"ContainerStarted","Data":"15891653c1198d1e19152bbc40408e474aa40a62bd9e88d6d47a12766ad7cb27"} Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.448909 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"4d091798d494d9d14492101433d53819444c87c5ca466d4691d6579c9e4a3a81"} Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.450009 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" event={"ID":"98a95f21-9e38-4113-848b-6b9ced267e38","Type":"ContainerStarted","Data":"079fc9087ac1dbbc55136ad3902dab2057d10198d02e73b861ecd8a1c43388fe"} Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.813332 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist\") pod 
\"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.823603 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d-memberlist\") pod \"speaker-mdltn\" (UID: \"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d\") " pod="metallb-system/speaker-mdltn" Nov 26 15:19:01 crc kubenswrapper[4785]: I1126 15:19:01.928382 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mdltn" Nov 26 15:19:01 crc kubenswrapper[4785]: W1126 15:19:01.950885 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a6a141b_7f8b_4b07_8afb_33bd21fc7b7d.slice/crio-781f2b93b3955e8d2448b29a999fa03ccd54e5a2f996c2389d3686e0b69405dc WatchSource:0}: Error finding container 781f2b93b3955e8d2448b29a999fa03ccd54e5a2f996c2389d3686e0b69405dc: Status 404 returned error can't find the container with id 781f2b93b3955e8d2448b29a999fa03ccd54e5a2f996c2389d3686e0b69405dc Nov 26 15:19:02 crc kubenswrapper[4785]: I1126 15:19:02.459173 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mdltn" event={"ID":"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d","Type":"ContainerStarted","Data":"4f3c36a0a4aa94e5c9e39a0b206e9bbe1fc60c418c1c7996ba610761d23a8d7a"} Nov 26 15:19:02 crc kubenswrapper[4785]: I1126 15:19:02.459225 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mdltn" event={"ID":"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d","Type":"ContainerStarted","Data":"781f2b93b3955e8d2448b29a999fa03ccd54e5a2f996c2389d3686e0b69405dc"} Nov 26 15:19:02 crc kubenswrapper[4785]: I1126 15:19:02.461837 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-9gfl9" 
event={"ID":"6aa1ab43-d7e6-412b-a16d-d92e016442bb","Type":"ContainerStarted","Data":"7734ce7a3c87b1cdf12ca9f4725dbba2139f69ae4c8f27c18ad8875cd57cc2a0"} Nov 26 15:19:05 crc kubenswrapper[4785]: I1126 15:19:05.489778 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mdltn" event={"ID":"2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d","Type":"ContainerStarted","Data":"6c204317c7240b940160f16fbaae2803113abcb18f2e7fdc6dd7dbe80987ae0c"} Nov 26 15:19:05 crc kubenswrapper[4785]: I1126 15:19:05.490735 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-mdltn" Nov 26 15:19:05 crc kubenswrapper[4785]: I1126 15:19:05.494240 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-9gfl9" event={"ID":"6aa1ab43-d7e6-412b-a16d-d92e016442bb","Type":"ContainerStarted","Data":"ff9b3512251363d805a6ade6bf63be883247d132886a48a30307272a91780dc1"} Nov 26 15:19:05 crc kubenswrapper[4785]: I1126 15:19:05.494385 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:05 crc kubenswrapper[4785]: I1126 15:19:05.509292 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mdltn" podStartSLOduration=2.6577817169999998 podStartE2EDuration="5.509275496s" podCreationTimestamp="2025-11-26 15:19:00 +0000 UTC" firstStartedPulling="2025-11-26 15:19:02.198342723 +0000 UTC m=+705.876708487" lastFinishedPulling="2025-11-26 15:19:05.049836502 +0000 UTC m=+708.728202266" observedRunningTime="2025-11-26 15:19:05.506295497 +0000 UTC m=+709.184661271" watchObservedRunningTime="2025-11-26 15:19:05.509275496 +0000 UTC m=+709.187641260" Nov 26 15:19:05 crc kubenswrapper[4785]: I1126 15:19:05.528270 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-9gfl9" podStartSLOduration=1.992634158 podStartE2EDuration="5.528255205s" 
podCreationTimestamp="2025-11-26 15:19:00 +0000 UTC" firstStartedPulling="2025-11-26 15:19:01.505265209 +0000 UTC m=+705.183631013" lastFinishedPulling="2025-11-26 15:19:05.040886296 +0000 UTC m=+708.719252060" observedRunningTime="2025-11-26 15:19:05.523409638 +0000 UTC m=+709.201775402" watchObservedRunningTime="2025-11-26 15:19:05.528255205 +0000 UTC m=+709.206620969" Nov 26 15:19:08 crc kubenswrapper[4785]: I1126 15:19:08.514152 4785 generic.go:334] "Generic (PLEG): container finished" podID="c54bc198-6911-461a-9c78-dfd7fd744524" containerID="fcaf8a66b505b60dab59aac980114df480cfe4815e1c5c81d907237d640ae643" exitCode=0 Nov 26 15:19:08 crc kubenswrapper[4785]: I1126 15:19:08.514239 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerDied","Data":"fcaf8a66b505b60dab59aac980114df480cfe4815e1c5c81d907237d640ae643"} Nov 26 15:19:08 crc kubenswrapper[4785]: I1126 15:19:08.516267 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" event={"ID":"98a95f21-9e38-4113-848b-6b9ced267e38","Type":"ContainerStarted","Data":"bce14b3b8c3aa86a97354a9e4abbdcea56421eacbe7f45aaf4ccecae4c49c3c4"} Nov 26 15:19:08 crc kubenswrapper[4785]: I1126 15:19:08.517019 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:08 crc kubenswrapper[4785]: I1126 15:19:08.566449 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" podStartSLOduration=1.855254561 podStartE2EDuration="8.566425038s" podCreationTimestamp="2025-11-26 15:19:00 +0000 UTC" firstStartedPulling="2025-11-26 15:19:00.777994215 +0000 UTC m=+704.456359979" lastFinishedPulling="2025-11-26 15:19:07.489164692 +0000 UTC m=+711.167530456" observedRunningTime="2025-11-26 15:19:08.565845993 +0000 UTC 
m=+712.244211797" watchObservedRunningTime="2025-11-26 15:19:08.566425038 +0000 UTC m=+712.244790842" Nov 26 15:19:09 crc kubenswrapper[4785]: I1126 15:19:09.525339 4785 generic.go:334] "Generic (PLEG): container finished" podID="c54bc198-6911-461a-9c78-dfd7fd744524" containerID="e867be94d86fb6ecdcbcec07d52633e579972b40960d57209eb03890491bbdee" exitCode=0 Nov 26 15:19:09 crc kubenswrapper[4785]: I1126 15:19:09.525612 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerDied","Data":"e867be94d86fb6ecdcbcec07d52633e579972b40960d57209eb03890491bbdee"} Nov 26 15:19:10 crc kubenswrapper[4785]: I1126 15:19:10.534634 4785 generic.go:334] "Generic (PLEG): container finished" podID="c54bc198-6911-461a-9c78-dfd7fd744524" containerID="7364ee363d9453c1f43164022b416fe10c889e0c0246de62cffc59a352deafcf" exitCode=0 Nov 26 15:19:10 crc kubenswrapper[4785]: I1126 15:19:10.534712 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerDied","Data":"7364ee363d9453c1f43164022b416fe10c889e0c0246de62cffc59a352deafcf"} Nov 26 15:19:11 crc kubenswrapper[4785]: I1126 15:19:11.105486 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-9gfl9" Nov 26 15:19:11 crc kubenswrapper[4785]: I1126 15:19:11.546932 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"be57704196f5e1e2e150e7ac27a48a9dcaebb10b0d99040f9a88bdfb0b061089"} Nov 26 15:19:11 crc kubenswrapper[4785]: I1126 15:19:11.546987 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" 
event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"32fa1e9699c5abdd3f80e40d699f23582f8dad5b445d2aeac2d366424bf18466"} Nov 26 15:19:11 crc kubenswrapper[4785]: I1126 15:19:11.547003 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"bc2a0b4f398466bc7c8f277bcb2d2ee8539682c5c66871b8a0ee46ecabe0b95d"} Nov 26 15:19:11 crc kubenswrapper[4785]: I1126 15:19:11.547013 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"1b25e3eeb5abbaa19ebf40b74994c24796511e578bc63b93f20f41bde1f0c169"} Nov 26 15:19:11 crc kubenswrapper[4785]: I1126 15:19:11.547023 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"3457911f4e29d76eb60129be8893b234ce34102d38f9278826f0a9c24613b77a"} Nov 26 15:19:12 crc kubenswrapper[4785]: I1126 15:19:12.559429 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h5jcn" event={"ID":"c54bc198-6911-461a-9c78-dfd7fd744524","Type":"ContainerStarted","Data":"beeddf17e6cf59f7f8920fe62f0cd3ed3bf0a9fa3f55c35fe8d8f43423614b8a"} Nov 26 15:19:12 crc kubenswrapper[4785]: I1126 15:19:12.559950 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:12 crc kubenswrapper[4785]: I1126 15:19:12.713188 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-h5jcn" podStartSLOduration=5.741609491 podStartE2EDuration="12.713163641s" podCreationTimestamp="2025-11-26 15:19:00 +0000 UTC" firstStartedPulling="2025-11-26 15:19:00.509271322 +0000 UTC m=+704.187637086" lastFinishedPulling="2025-11-26 15:19:07.480825472 +0000 UTC m=+711.159191236" 
observedRunningTime="2025-11-26 15:19:12.710005809 +0000 UTC m=+716.388371613" watchObservedRunningTime="2025-11-26 15:19:12.713163641 +0000 UTC m=+716.391529435" Nov 26 15:19:15 crc kubenswrapper[4785]: I1126 15:19:15.357171 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:15 crc kubenswrapper[4785]: I1126 15:19:15.432802 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:20 crc kubenswrapper[4785]: I1126 15:19:20.360950 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h5jcn" Nov 26 15:19:20 crc kubenswrapper[4785]: I1126 15:19:20.380488 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-7bmlh" Nov 26 15:19:21 crc kubenswrapper[4785]: I1126 15:19:21.932949 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mdltn" Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.833179 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-m7w9j"] Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.834134 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.850082 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-qq2jl" Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.850176 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.850580 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.859746 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-m7w9j"] Nov 26 15:19:28 crc kubenswrapper[4785]: I1126 15:19:28.898704 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2km\" (UniqueName: \"kubernetes.io/projected/3102af1a-6a23-462b-a99b-0198a849dca5-kube-api-access-fx2km\") pod \"mariadb-operator-index-m7w9j\" (UID: \"3102af1a-6a23-462b-a99b-0198a849dca5\") " pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:29 crc kubenswrapper[4785]: I1126 15:19:29.000236 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx2km\" (UniqueName: \"kubernetes.io/projected/3102af1a-6a23-462b-a99b-0198a849dca5-kube-api-access-fx2km\") pod \"mariadb-operator-index-m7w9j\" (UID: \"3102af1a-6a23-462b-a99b-0198a849dca5\") " pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:29 crc kubenswrapper[4785]: I1126 15:19:29.019027 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2km\" (UniqueName: \"kubernetes.io/projected/3102af1a-6a23-462b-a99b-0198a849dca5-kube-api-access-fx2km\") pod \"mariadb-operator-index-m7w9j\" (UID: \"3102af1a-6a23-462b-a99b-0198a849dca5\") " 
pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:29 crc kubenswrapper[4785]: I1126 15:19:29.149772 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:29 crc kubenswrapper[4785]: I1126 15:19:29.417271 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-m7w9j"] Nov 26 15:19:29 crc kubenswrapper[4785]: I1126 15:19:29.664831 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-m7w9j" event={"ID":"3102af1a-6a23-462b-a99b-0198a849dca5","Type":"ContainerStarted","Data":"d71e020e1a72a442cc9f00366a71123a4fce1f22ea95381b26b6fdd14757189f"} Nov 26 15:19:32 crc kubenswrapper[4785]: I1126 15:19:32.714242 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-m7w9j" event={"ID":"3102af1a-6a23-462b-a99b-0198a849dca5","Type":"ContainerStarted","Data":"1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4"} Nov 26 15:19:32 crc kubenswrapper[4785]: I1126 15:19:32.743443 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-m7w9j" podStartSLOduration=2.37794161 podStartE2EDuration="4.743424816s" podCreationTimestamp="2025-11-26 15:19:28 +0000 UTC" firstStartedPulling="2025-11-26 15:19:29.427924353 +0000 UTC m=+733.106290117" lastFinishedPulling="2025-11-26 15:19:31.793407519 +0000 UTC m=+735.471773323" observedRunningTime="2025-11-26 15:19:32.737266886 +0000 UTC m=+736.415632720" watchObservedRunningTime="2025-11-26 15:19:32.743424816 +0000 UTC m=+736.421790590" Nov 26 15:19:32 crc kubenswrapper[4785]: I1126 15:19:32.968659 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-m7w9j"] Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.582215 4785 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-index-h9jqb"] Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.583756 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.595629 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h9jqb"] Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.667999 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lddx\" (UniqueName: \"kubernetes.io/projected/06d34fac-12a4-41e4-96b6-1d8df99cfee4-kube-api-access-9lddx\") pod \"mariadb-operator-index-h9jqb\" (UID: \"06d34fac-12a4-41e4-96b6-1d8df99cfee4\") " pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.769758 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lddx\" (UniqueName: \"kubernetes.io/projected/06d34fac-12a4-41e4-96b6-1d8df99cfee4-kube-api-access-9lddx\") pod \"mariadb-operator-index-h9jqb\" (UID: \"06d34fac-12a4-41e4-96b6-1d8df99cfee4\") " pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.806131 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lddx\" (UniqueName: \"kubernetes.io/projected/06d34fac-12a4-41e4-96b6-1d8df99cfee4-kube-api-access-9lddx\") pod \"mariadb-operator-index-h9jqb\" (UID: \"06d34fac-12a4-41e4-96b6-1d8df99cfee4\") " pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:33 crc kubenswrapper[4785]: I1126 15:19:33.906183 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:34 crc kubenswrapper[4785]: I1126 15:19:34.092467 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h9jqb"] Nov 26 15:19:34 crc kubenswrapper[4785]: W1126 15:19:34.099155 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d34fac_12a4_41e4_96b6_1d8df99cfee4.slice/crio-3a6b59fe00ced35e899f8bfa743fe507dac4167da370ed010807ef154e70da5b WatchSource:0}: Error finding container 3a6b59fe00ced35e899f8bfa743fe507dac4167da370ed010807ef154e70da5b: Status 404 returned error can't find the container with id 3a6b59fe00ced35e899f8bfa743fe507dac4167da370ed010807ef154e70da5b Nov 26 15:19:34 crc kubenswrapper[4785]: I1126 15:19:34.727178 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h9jqb" event={"ID":"06d34fac-12a4-41e4-96b6-1d8df99cfee4","Type":"ContainerStarted","Data":"3a6b59fe00ced35e899f8bfa743fe507dac4167da370ed010807ef154e70da5b"} Nov 26 15:19:34 crc kubenswrapper[4785]: I1126 15:19:34.727638 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-m7w9j" podUID="3102af1a-6a23-462b-a99b-0198a849dca5" containerName="registry-server" containerID="cri-o://1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4" gracePeriod=2 Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.175409 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.288815 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx2km\" (UniqueName: \"kubernetes.io/projected/3102af1a-6a23-462b-a99b-0198a849dca5-kube-api-access-fx2km\") pod \"3102af1a-6a23-462b-a99b-0198a849dca5\" (UID: \"3102af1a-6a23-462b-a99b-0198a849dca5\") " Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.297598 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3102af1a-6a23-462b-a99b-0198a849dca5-kube-api-access-fx2km" (OuterVolumeSpecName: "kube-api-access-fx2km") pod "3102af1a-6a23-462b-a99b-0198a849dca5" (UID: "3102af1a-6a23-462b-a99b-0198a849dca5"). InnerVolumeSpecName "kube-api-access-fx2km". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.390407 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx2km\" (UniqueName: \"kubernetes.io/projected/3102af1a-6a23-462b-a99b-0198a849dca5-kube-api-access-fx2km\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.737948 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h9jqb" event={"ID":"06d34fac-12a4-41e4-96b6-1d8df99cfee4","Type":"ContainerStarted","Data":"aec5dbe2ce0a8e5abf203b3e29e165fd1a3578a1ebe302b5e3c3d680a47d4e98"} Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.740704 4785 generic.go:334] "Generic (PLEG): container finished" podID="3102af1a-6a23-462b-a99b-0198a849dca5" containerID="1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4" exitCode=0 Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.740777 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-m7w9j" 
event={"ID":"3102af1a-6a23-462b-a99b-0198a849dca5","Type":"ContainerDied","Data":"1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4"} Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.740938 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-m7w9j" event={"ID":"3102af1a-6a23-462b-a99b-0198a849dca5","Type":"ContainerDied","Data":"d71e020e1a72a442cc9f00366a71123a4fce1f22ea95381b26b6fdd14757189f"} Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.740986 4785 scope.go:117] "RemoveContainer" containerID="1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.741199 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-m7w9j" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.772444 4785 scope.go:117] "RemoveContainer" containerID="1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4" Nov 26 15:19:35 crc kubenswrapper[4785]: E1126 15:19:35.773260 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4\": container with ID starting with 1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4 not found: ID does not exist" containerID="1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.773303 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4"} err="failed to get container status \"1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4\": rpc error: code = NotFound desc = could not find container \"1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4\": container with ID starting with 
1b96d80804b8bbbde8c7743bea643bafa4f8094da2fd5350b222f589ea4549a4 not found: ID does not exist" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.774842 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-h9jqb" podStartSLOduration=2.00438714 podStartE2EDuration="2.77481173s" podCreationTimestamp="2025-11-26 15:19:33 +0000 UTC" firstStartedPulling="2025-11-26 15:19:34.104405401 +0000 UTC m=+737.782771165" lastFinishedPulling="2025-11-26 15:19:34.874829991 +0000 UTC m=+738.553195755" observedRunningTime="2025-11-26 15:19:35.762158712 +0000 UTC m=+739.440524526" watchObservedRunningTime="2025-11-26 15:19:35.77481173 +0000 UTC m=+739.453177534" Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.804683 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-m7w9j"] Nov 26 15:19:35 crc kubenswrapper[4785]: I1126 15:19:35.808241 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-m7w9j"] Nov 26 15:19:37 crc kubenswrapper[4785]: I1126 15:19:37.047128 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3102af1a-6a23-462b-a99b-0198a849dca5" path="/var/lib/kubelet/pods/3102af1a-6a23-462b-a99b-0198a849dca5/volumes" Nov 26 15:19:37 crc kubenswrapper[4785]: I1126 15:19:37.289303 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:19:37 crc kubenswrapper[4785]: I1126 15:19:37.289400 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:19:43 crc kubenswrapper[4785]: I1126 15:19:43.907066 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:43 crc kubenswrapper[4785]: I1126 15:19:43.907696 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:43 crc kubenswrapper[4785]: I1126 15:19:43.942343 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:44 crc kubenswrapper[4785]: I1126 15:19:44.846260 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-h9jqb" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.380016 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dss42"] Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.380267 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" podUID="21c0b576-b3af-4d08-8e09-2c3728c8623e" containerName="controller-manager" containerID="cri-o://922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6" gracePeriod=30 Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.464818 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"] Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.465029 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" podUID="6428b894-a54a-4acc-8f84-66843f7165f0" containerName="route-controller-manager" 
containerID="cri-o://97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed" gracePeriod=30 Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.732151 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.804677 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.817210 4785 generic.go:334] "Generic (PLEG): container finished" podID="21c0b576-b3af-4d08-8e09-2c3728c8623e" containerID="922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6" exitCode=0 Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.817279 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" event={"ID":"21c0b576-b3af-4d08-8e09-2c3728c8623e","Type":"ContainerDied","Data":"922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6"} Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.817307 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" event={"ID":"21c0b576-b3af-4d08-8e09-2c3728c8623e","Type":"ContainerDied","Data":"99a90ce8e7350bb22ead93c90ef66f2288bc38cce129ed78d6d1195c736bb5bd"} Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.817315 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dss42" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.817325 4785 scope.go:117] "RemoveContainer" containerID="922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.822821 4785 generic.go:334] "Generic (PLEG): container finished" podID="6428b894-a54a-4acc-8f84-66843f7165f0" containerID="97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed" exitCode=0 Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.823679 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.823816 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" event={"ID":"6428b894-a54a-4acc-8f84-66843f7165f0","Type":"ContainerDied","Data":"97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed"} Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.823880 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m" event={"ID":"6428b894-a54a-4acc-8f84-66843f7165f0","Type":"ContainerDied","Data":"ee9e5c86c3bf579e7074af649739a8a3d5bf490e5fb2b8ef6149e9e9e08f2d51"} Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.840136 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5kn4\" (UniqueName: \"kubernetes.io/projected/21c0b576-b3af-4d08-8e09-2c3728c8623e-kube-api-access-n5kn4\") pod \"21c0b576-b3af-4d08-8e09-2c3728c8623e\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.840173 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-proxy-ca-bundles\") pod \"21c0b576-b3af-4d08-8e09-2c3728c8623e\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.840277 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-client-ca\") pod \"21c0b576-b3af-4d08-8e09-2c3728c8623e\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.840297 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-config\") pod \"21c0b576-b3af-4d08-8e09-2c3728c8623e\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.840351 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c0b576-b3af-4d08-8e09-2c3728c8623e-serving-cert\") pod \"21c0b576-b3af-4d08-8e09-2c3728c8623e\" (UID: \"21c0b576-b3af-4d08-8e09-2c3728c8623e\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.873197 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c0b576-b3af-4d08-8e09-2c3728c8623e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "21c0b576-b3af-4d08-8e09-2c3728c8623e" (UID: "21c0b576-b3af-4d08-8e09-2c3728c8623e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.873398 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-client-ca" (OuterVolumeSpecName: "client-ca") pod "21c0b576-b3af-4d08-8e09-2c3728c8623e" (UID: "21c0b576-b3af-4d08-8e09-2c3728c8623e"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.873429 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "21c0b576-b3af-4d08-8e09-2c3728c8623e" (UID: "21c0b576-b3af-4d08-8e09-2c3728c8623e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.873597 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-config" (OuterVolumeSpecName: "config") pod "21c0b576-b3af-4d08-8e09-2c3728c8623e" (UID: "21c0b576-b3af-4d08-8e09-2c3728c8623e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.875334 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c0b576-b3af-4d08-8e09-2c3728c8623e-kube-api-access-n5kn4" (OuterVolumeSpecName: "kube-api-access-n5kn4") pod "21c0b576-b3af-4d08-8e09-2c3728c8623e" (UID: "21c0b576-b3af-4d08-8e09-2c3728c8623e"). InnerVolumeSpecName "kube-api-access-n5kn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.875744 4785 scope.go:117] "RemoveContainer" containerID="922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6" Nov 26 15:19:45 crc kubenswrapper[4785]: E1126 15:19:45.876281 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6\": container with ID starting with 922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6 not found: ID does not exist" containerID="922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.876319 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6"} err="failed to get container status \"922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6\": rpc error: code = NotFound desc = could not find container \"922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6\": container with ID starting with 922a0f563b0bcbe0266469c9db21898a5984b4955d5ffcf9f400951cf54995e6 not found: ID does not exist" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.876347 4785 scope.go:117] "RemoveContainer" containerID="97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.897869 4785 scope.go:117] "RemoveContainer" containerID="97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed" Nov 26 15:19:45 crc kubenswrapper[4785]: E1126 15:19:45.898341 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed\": container with ID starting with 
97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed not found: ID does not exist" containerID="97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.898385 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed"} err="failed to get container status \"97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed\": rpc error: code = NotFound desc = could not find container \"97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed\": container with ID starting with 97f31c1fc87ff33c5bd9529349429c4b06ece2231977446049756454a3dc85ed not found: ID does not exist" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.941652 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6428b894-a54a-4acc-8f84-66843f7165f0-serving-cert\") pod \"6428b894-a54a-4acc-8f84-66843f7165f0\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.941713 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-client-ca\") pod \"6428b894-a54a-4acc-8f84-66843f7165f0\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.941756 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72tq\" (UniqueName: \"kubernetes.io/projected/6428b894-a54a-4acc-8f84-66843f7165f0-kube-api-access-f72tq\") pod \"6428b894-a54a-4acc-8f84-66843f7165f0\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.941822 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-config\") pod \"6428b894-a54a-4acc-8f84-66843f7165f0\" (UID: \"6428b894-a54a-4acc-8f84-66843f7165f0\") " Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.942060 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5kn4\" (UniqueName: \"kubernetes.io/projected/21c0b576-b3af-4d08-8e09-2c3728c8623e-kube-api-access-n5kn4\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.942087 4785 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.942105 4785 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.942120 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c0b576-b3af-4d08-8e09-2c3728c8623e-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.942136 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c0b576-b3af-4d08-8e09-2c3728c8623e-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.943331 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "6428b894-a54a-4acc-8f84-66843f7165f0" (UID: "6428b894-a54a-4acc-8f84-66843f7165f0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.943455 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-config" (OuterVolumeSpecName: "config") pod "6428b894-a54a-4acc-8f84-66843f7165f0" (UID: "6428b894-a54a-4acc-8f84-66843f7165f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.949309 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6428b894-a54a-4acc-8f84-66843f7165f0-kube-api-access-f72tq" (OuterVolumeSpecName: "kube-api-access-f72tq") pod "6428b894-a54a-4acc-8f84-66843f7165f0" (UID: "6428b894-a54a-4acc-8f84-66843f7165f0"). InnerVolumeSpecName "kube-api-access-f72tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:19:45 crc kubenswrapper[4785]: I1126 15:19:45.949478 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6428b894-a54a-4acc-8f84-66843f7165f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6428b894-a54a-4acc-8f84-66843f7165f0" (UID: "6428b894-a54a-4acc-8f84-66843f7165f0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.043834 4785 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6428b894-a54a-4acc-8f84-66843f7165f0-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.043898 4785 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-client-ca\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.043921 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72tq\" (UniqueName: \"kubernetes.io/projected/6428b894-a54a-4acc-8f84-66843f7165f0-kube-api-access-f72tq\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.043944 4785 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6428b894-a54a-4acc-8f84-66843f7165f0-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.173687 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dss42"] Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.177067 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dss42"] Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.181952 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"] Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.184937 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sq88m"] Nov 26 15:19:46 crc kubenswrapper[4785]: E1126 15:19:46.217670 4785 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c0b576_b3af_4d08_8e09_2c3728c8623e.slice/crio-99a90ce8e7350bb22ead93c90ef66f2288bc38cce129ed78d6d1195c736bb5bd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c0b576_b3af_4d08_8e09_2c3728c8623e.slice\": RecentStats: unable to find data in memory cache]" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981115 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb"] Nov 26 15:19:46 crc kubenswrapper[4785]: E1126 15:19:46.981521 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6428b894-a54a-4acc-8f84-66843f7165f0" containerName="route-controller-manager" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981531 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="6428b894-a54a-4acc-8f84-66843f7165f0" containerName="route-controller-manager" Nov 26 15:19:46 crc kubenswrapper[4785]: E1126 15:19:46.981540 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3102af1a-6a23-462b-a99b-0198a849dca5" containerName="registry-server" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981545 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="3102af1a-6a23-462b-a99b-0198a849dca5" containerName="registry-server" Nov 26 15:19:46 crc kubenswrapper[4785]: E1126 15:19:46.981572 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c0b576-b3af-4d08-8e09-2c3728c8623e" containerName="controller-manager" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981579 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c0b576-b3af-4d08-8e09-2c3728c8623e" containerName="controller-manager" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981668 4785 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3102af1a-6a23-462b-a99b-0198a849dca5" containerName="registry-server" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981678 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c0b576-b3af-4d08-8e09-2c3728c8623e" containerName="controller-manager" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.981691 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="6428b894-a54a-4acc-8f84-66843f7165f0" containerName="route-controller-manager" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.982038 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.984957 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.985111 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.985537 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.985808 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.986090 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.987459 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 15:19:46 crc kubenswrapper[4785]: I1126 15:19:46.992591 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb"] Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.043377 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c0b576-b3af-4d08-8e09-2c3728c8623e" path="/var/lib/kubelet/pods/21c0b576-b3af-4d08-8e09-2c3728c8623e/volumes" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.044168 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6428b894-a54a-4acc-8f84-66843f7165f0" path="/var/lib/kubelet/pods/6428b894-a54a-4acc-8f84-66843f7165f0/volumes" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.053542 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pv7\" (UniqueName: \"kubernetes.io/projected/855f4a26-3755-4fb3-b6a5-0f32b292b24d-kube-api-access-22pv7\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.053690 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/855f4a26-3755-4fb3-b6a5-0f32b292b24d-client-ca\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.053874 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/855f4a26-3755-4fb3-b6a5-0f32b292b24d-serving-cert\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc 
kubenswrapper[4785]: I1126 15:19:47.053941 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f4a26-3755-4fb3-b6a5-0f32b292b24d-config\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.155018 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/855f4a26-3755-4fb3-b6a5-0f32b292b24d-client-ca\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.155602 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/855f4a26-3755-4fb3-b6a5-0f32b292b24d-serving-cert\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.155644 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f4a26-3755-4fb3-b6a5-0f32b292b24d-config\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.155712 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22pv7\" (UniqueName: \"kubernetes.io/projected/855f4a26-3755-4fb3-b6a5-0f32b292b24d-kube-api-access-22pv7\") pod 
\"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.156094 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/855f4a26-3755-4fb3-b6a5-0f32b292b24d-client-ca\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.156741 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f4a26-3755-4fb3-b6a5-0f32b292b24d-config\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.159869 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/855f4a26-3755-4fb3-b6a5-0f32b292b24d-serving-cert\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.177686 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22pv7\" (UniqueName: \"kubernetes.io/projected/855f4a26-3755-4fb3-b6a5-0f32b292b24d-kube-api-access-22pv7\") pod \"route-controller-manager-5fc6746fbb-dmhmb\" (UID: \"855f4a26-3755-4fb3-b6a5-0f32b292b24d\") " pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.269290 4785 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx"] Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.270072 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.272127 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.272247 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.272432 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.272599 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.273057 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.273245 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.280159 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx"] Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.280286 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.297582 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.357645 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-proxy-ca-bundles\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.357695 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f721c02-f1fb-43be-96e6-3b92eb3bef15-serving-cert\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.357728 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcfm\" (UniqueName: \"kubernetes.io/projected/0f721c02-f1fb-43be-96e6-3b92eb3bef15-kube-api-access-7gcfm\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.357760 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-config\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.357821 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-client-ca\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.458809 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-proxy-ca-bundles\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.458860 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f721c02-f1fb-43be-96e6-3b92eb3bef15-serving-cert\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.458895 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcfm\" (UniqueName: \"kubernetes.io/projected/0f721c02-f1fb-43be-96e6-3b92eb3bef15-kube-api-access-7gcfm\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.458930 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-config\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " 
pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.458993 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-client-ca\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.460741 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-client-ca\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.461767 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-proxy-ca-bundles\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.463921 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f721c02-f1fb-43be-96e6-3b92eb3bef15-config\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.465618 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f721c02-f1fb-43be-96e6-3b92eb3bef15-serving-cert\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: 
\"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.478234 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcfm\" (UniqueName: \"kubernetes.io/projected/0f721c02-f1fb-43be-96e6-3b92eb3bef15-kube-api-access-7gcfm\") pod \"controller-manager-7587b4cb4b-xhlcx\" (UID: \"0f721c02-f1fb-43be-96e6-3b92eb3bef15\") " pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.483575 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb"] Nov 26 15:19:47 crc kubenswrapper[4785]: W1126 15:19:47.500913 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod855f4a26_3755_4fb3_b6a5_0f32b292b24d.slice/crio-a7abfedac37997068f3efe46d8b9d84d03ac6405a1b9fab59dabca5feed91ba5 WatchSource:0}: Error finding container a7abfedac37997068f3efe46d8b9d84d03ac6405a1b9fab59dabca5feed91ba5: Status 404 returned error can't find the container with id a7abfedac37997068f3efe46d8b9d84d03ac6405a1b9fab59dabca5feed91ba5 Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.586529 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.837874 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" event={"ID":"855f4a26-3755-4fb3-b6a5-0f32b292b24d","Type":"ContainerStarted","Data":"399df145a5f58305d678af13608857ff8b797ed0f8749d46a28ab3c410800537"} Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.837913 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" event={"ID":"855f4a26-3755-4fb3-b6a5-0f32b292b24d","Type":"ContainerStarted","Data":"a7abfedac37997068f3efe46d8b9d84d03ac6405a1b9fab59dabca5feed91ba5"} Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.838165 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.853501 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" podStartSLOduration=1.853482141 podStartE2EDuration="1.853482141s" podCreationTimestamp="2025-11-26 15:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:19:47.85192014 +0000 UTC m=+751.530285924" watchObservedRunningTime="2025-11-26 15:19:47.853482141 +0000 UTC m=+751.531847905" Nov 26 15:19:47 crc kubenswrapper[4785]: I1126 15:19:47.992435 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx"] Nov 26 15:19:48 crc kubenswrapper[4785]: I1126 15:19:48.146614 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5fc6746fbb-dmhmb" Nov 26 15:19:48 crc kubenswrapper[4785]: I1126 15:19:48.844841 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" event={"ID":"0f721c02-f1fb-43be-96e6-3b92eb3bef15","Type":"ContainerStarted","Data":"ad3f7765fd6569749dfc4d4bef4d03728aad11a377750200e34bb5b499172948"} Nov 26 15:19:48 crc kubenswrapper[4785]: I1126 15:19:48.844920 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" event={"ID":"0f721c02-f1fb-43be-96e6-3b92eb3bef15","Type":"ContainerStarted","Data":"b4b8b16967c53e0fa98aa31f3761cdbc3a6e35851f4935a9139ca1289c32963e"} Nov 26 15:19:48 crc kubenswrapper[4785]: I1126 15:19:48.868340 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" podStartSLOduration=3.868320759 podStartE2EDuration="3.868320759s" podCreationTimestamp="2025-11-26 15:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:19:48.865727192 +0000 UTC m=+752.544092986" watchObservedRunningTime="2025-11-26 15:19:48.868320759 +0000 UTC m=+752.546686543" Nov 26 15:19:49 crc kubenswrapper[4785]: I1126 15:19:49.854522 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:49 crc kubenswrapper[4785]: I1126 15:19:49.862497 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7587b4cb4b-xhlcx" Nov 26 15:19:50 crc kubenswrapper[4785]: I1126 15:19:50.908302 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9"] Nov 26 15:19:50 crc 
kubenswrapper[4785]: I1126 15:19:50.909362 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:50 crc kubenswrapper[4785]: I1126 15:19:50.911391 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sfhll" Nov 26 15:19:50 crc kubenswrapper[4785]: I1126 15:19:50.924752 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9"] Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.003410 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rghx\" (UniqueName: \"kubernetes.io/projected/4aad6667-7a55-4c14-a191-7723fd1e5274-kube-api-access-8rghx\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.003479 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.003502 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " 
pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.104743 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.105694 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-bundle\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.105987 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.106275 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rghx\" (UniqueName: \"kubernetes.io/projected/4aad6667-7a55-4c14-a191-7723fd1e5274-kube-api-access-8rghx\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.106461 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-util\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.131110 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rghx\" (UniqueName: \"kubernetes.io/projected/4aad6667-7a55-4c14-a191-7723fd1e5274-kube-api-access-8rghx\") pod \"27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.228731 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.641294 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9"] Nov 26 15:19:51 crc kubenswrapper[4785]: W1126 15:19:51.654849 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aad6667_7a55_4c14_a191_7723fd1e5274.slice/crio-15255ce84f4524b01328f17e634b3221cac5be46a9f08512d8226d7da2a9089e WatchSource:0}: Error finding container 15255ce84f4524b01328f17e634b3221cac5be46a9f08512d8226d7da2a9089e: Status 404 returned error can't find the container with id 15255ce84f4524b01328f17e634b3221cac5be46a9f08512d8226d7da2a9089e Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.867814 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" event={"ID":"4aad6667-7a55-4c14-a191-7723fd1e5274","Type":"ContainerStarted","Data":"d658deb727bff3e6e5f49295a87407cdda997751c19f7ce24a9111af71a12e71"} Nov 26 15:19:51 crc kubenswrapper[4785]: I1126 15:19:51.867871 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" event={"ID":"4aad6667-7a55-4c14-a191-7723fd1e5274","Type":"ContainerStarted","Data":"15255ce84f4524b01328f17e634b3221cac5be46a9f08512d8226d7da2a9089e"} Nov 26 15:19:52 crc kubenswrapper[4785]: I1126 15:19:52.877445 4785 generic.go:334] "Generic (PLEG): container finished" podID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerID="d658deb727bff3e6e5f49295a87407cdda997751c19f7ce24a9111af71a12e71" exitCode=0 Nov 26 15:19:52 crc kubenswrapper[4785]: I1126 15:19:52.877508 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" event={"ID":"4aad6667-7a55-4c14-a191-7723fd1e5274","Type":"ContainerDied","Data":"d658deb727bff3e6e5f49295a87407cdda997751c19f7ce24a9111af71a12e71"} Nov 26 15:19:53 crc kubenswrapper[4785]: I1126 15:19:53.862702 4785 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 26 15:19:54 crc kubenswrapper[4785]: I1126 15:19:54.893133 4785 generic.go:334] "Generic (PLEG): container finished" podID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerID="0f90d48567aec2d48651b1085648317c934f077b69f9a58b4e530143fca22c43" exitCode=0 Nov 26 15:19:54 crc kubenswrapper[4785]: I1126 15:19:54.893192 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" 
event={"ID":"4aad6667-7a55-4c14-a191-7723fd1e5274","Type":"ContainerDied","Data":"0f90d48567aec2d48651b1085648317c934f077b69f9a58b4e530143fca22c43"} Nov 26 15:19:55 crc kubenswrapper[4785]: I1126 15:19:55.905219 4785 generic.go:334] "Generic (PLEG): container finished" podID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerID="a0abbf5902c451576d4c24bd81dd0b5abf99bb0c24048134fce2808d62be2d6e" exitCode=0 Nov 26 15:19:55 crc kubenswrapper[4785]: I1126 15:19:55.905269 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" event={"ID":"4aad6667-7a55-4c14-a191-7723fd1e5274","Type":"ContainerDied","Data":"a0abbf5902c451576d4c24bd81dd0b5abf99bb0c24048134fce2808d62be2d6e"} Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.263008 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.399127 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-bundle\") pod \"4aad6667-7a55-4c14-a191-7723fd1e5274\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.399217 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rghx\" (UniqueName: \"kubernetes.io/projected/4aad6667-7a55-4c14-a191-7723fd1e5274-kube-api-access-8rghx\") pod \"4aad6667-7a55-4c14-a191-7723fd1e5274\" (UID: \"4aad6667-7a55-4c14-a191-7723fd1e5274\") " Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.399262 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-util\") pod \"4aad6667-7a55-4c14-a191-7723fd1e5274\" (UID: 
\"4aad6667-7a55-4c14-a191-7723fd1e5274\") " Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.400455 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-bundle" (OuterVolumeSpecName: "bundle") pod "4aad6667-7a55-4c14-a191-7723fd1e5274" (UID: "4aad6667-7a55-4c14-a191-7723fd1e5274"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.409349 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-util" (OuterVolumeSpecName: "util") pod "4aad6667-7a55-4c14-a191-7723fd1e5274" (UID: "4aad6667-7a55-4c14-a191-7723fd1e5274"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.409731 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aad6667-7a55-4c14-a191-7723fd1e5274-kube-api-access-8rghx" (OuterVolumeSpecName: "kube-api-access-8rghx") pod "4aad6667-7a55-4c14-a191-7723fd1e5274" (UID: "4aad6667-7a55-4c14-a191-7723fd1e5274"). InnerVolumeSpecName "kube-api-access-8rghx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.501003 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.501045 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rghx\" (UniqueName: \"kubernetes.io/projected/4aad6667-7a55-4c14-a191-7723fd1e5274-kube-api-access-8rghx\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.501059 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4aad6667-7a55-4c14-a191-7723fd1e5274-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.922740 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" event={"ID":"4aad6667-7a55-4c14-a191-7723fd1e5274","Type":"ContainerDied","Data":"15255ce84f4524b01328f17e634b3221cac5be46a9f08512d8226d7da2a9089e"} Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.923371 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15255ce84f4524b01328f17e634b3221cac5be46a9f08512d8226d7da2a9089e" Nov 26 15:19:57 crc kubenswrapper[4785]: I1126 15:19:57.922876 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.065258 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"] Nov 26 15:20:04 crc kubenswrapper[4785]: E1126 15:20:04.065910 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="pull" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.065923 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="pull" Nov 26 15:20:04 crc kubenswrapper[4785]: E1126 15:20:04.065935 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="util" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.065941 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="util" Nov 26 15:20:04 crc kubenswrapper[4785]: E1126 15:20:04.065955 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="extract" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.065962 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="extract" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.066066 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aad6667-7a55-4c14-a191-7723fd1e5274" containerName="extract" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.066417 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.068454 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.068647 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.068682 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fktn4" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.078643 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"] Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.187834 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-apiservice-cert\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.187895 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-webhook-cert\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.187929 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzt9m\" 
(UniqueName: \"kubernetes.io/projected/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-kube-api-access-fzt9m\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.288921 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-webhook-cert\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.289114 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzt9m\" (UniqueName: \"kubernetes.io/projected/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-kube-api-access-fzt9m\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.289185 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-apiservice-cert\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.294101 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-apiservice-cert\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " 
pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.305413 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-webhook-cert\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.305991 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzt9m\" (UniqueName: \"kubernetes.io/projected/8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a-kube-api-access-fzt9m\") pod \"mariadb-operator-controller-manager-747fb5cb85-5slw2\" (UID: \"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a\") " pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.383764 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.805215 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"] Nov 26 15:20:04 crc kubenswrapper[4785]: I1126 15:20:04.976422 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" event={"ID":"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a","Type":"ContainerStarted","Data":"d408364c066a2dd4629ff7d9ffeb8f896d8d495f46f07311592ea5fd6abe80d6"} Nov 26 15:20:07 crc kubenswrapper[4785]: I1126 15:20:07.288886 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:20:07 crc kubenswrapper[4785]: I1126 15:20:07.289262 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:20:10 crc kubenswrapper[4785]: I1126 15:20:10.005097 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" event={"ID":"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a","Type":"ContainerStarted","Data":"24bb1ca1f490c9d7f463fb454dd4b20660d0c21a62cc095d06e0fca8c5f19482"} Nov 26 15:20:10 crc kubenswrapper[4785]: I1126 15:20:10.005486 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:10 crc kubenswrapper[4785]: 
I1126 15:20:10.019065 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podStartSLOduration=1.599734773 podStartE2EDuration="6.019049851s" podCreationTimestamp="2025-11-26 15:20:04 +0000 UTC" firstStartedPulling="2025-11-26 15:20:04.828753638 +0000 UTC m=+768.507119402" lastFinishedPulling="2025-11-26 15:20:09.248068716 +0000 UTC m=+772.926434480" observedRunningTime="2025-11-26 15:20:10.018029464 +0000 UTC m=+773.696395238" watchObservedRunningTime="2025-11-26 15:20:10.019049851 +0000 UTC m=+773.697415615" Nov 26 15:20:14 crc kubenswrapper[4785]: I1126 15:20:14.390082 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.177267 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-fksfv"] Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.178771 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.181051 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-v6tlh" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.192264 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-fksfv"] Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.313481 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4xqh\" (UniqueName: \"kubernetes.io/projected/41d2f8b5-9b84-41a6-8900-fd76783adb8c-kube-api-access-n4xqh\") pod \"infra-operator-index-fksfv\" (UID: \"41d2f8b5-9b84-41a6-8900-fd76783adb8c\") " pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.415377 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4xqh\" (UniqueName: \"kubernetes.io/projected/41d2f8b5-9b84-41a6-8900-fd76783adb8c-kube-api-access-n4xqh\") pod \"infra-operator-index-fksfv\" (UID: \"41d2f8b5-9b84-41a6-8900-fd76783adb8c\") " pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.449947 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4xqh\" (UniqueName: \"kubernetes.io/projected/41d2f8b5-9b84-41a6-8900-fd76783adb8c-kube-api-access-n4xqh\") pod \"infra-operator-index-fksfv\" (UID: \"41d2f8b5-9b84-41a6-8900-fd76783adb8c\") " pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.543142 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:19 crc kubenswrapper[4785]: I1126 15:20:19.949902 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-fksfv"] Nov 26 15:20:19 crc kubenswrapper[4785]: W1126 15:20:19.956839 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d2f8b5_9b84_41a6_8900_fd76783adb8c.slice/crio-bcada80661f6b5ff2b07315500254b261ae17ba9c3973b39f2e89693970f4b7b WatchSource:0}: Error finding container bcada80661f6b5ff2b07315500254b261ae17ba9c3973b39f2e89693970f4b7b: Status 404 returned error can't find the container with id bcada80661f6b5ff2b07315500254b261ae17ba9c3973b39f2e89693970f4b7b Nov 26 15:20:20 crc kubenswrapper[4785]: I1126 15:20:20.071090 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-fksfv" event={"ID":"41d2f8b5-9b84-41a6-8900-fd76783adb8c","Type":"ContainerStarted","Data":"bcada80661f6b5ff2b07315500254b261ae17ba9c3973b39f2e89693970f4b7b"} Nov 26 15:20:22 crc kubenswrapper[4785]: I1126 15:20:22.086490 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-fksfv" event={"ID":"41d2f8b5-9b84-41a6-8900-fd76783adb8c","Type":"ContainerStarted","Data":"024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848"} Nov 26 15:20:22 crc kubenswrapper[4785]: I1126 15:20:22.102573 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-fksfv" podStartSLOduration=1.904226567 podStartE2EDuration="3.102538364s" podCreationTimestamp="2025-11-26 15:20:19 +0000 UTC" firstStartedPulling="2025-11-26 15:20:19.960969634 +0000 UTC m=+783.639335428" lastFinishedPulling="2025-11-26 15:20:21.159281461 +0000 UTC m=+784.837647225" observedRunningTime="2025-11-26 15:20:22.101354362 +0000 UTC m=+785.779720156" 
watchObservedRunningTime="2025-11-26 15:20:22.102538364 +0000 UTC m=+785.780904128" Nov 26 15:20:23 crc kubenswrapper[4785]: I1126 15:20:23.178133 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-fksfv"] Nov 26 15:20:23 crc kubenswrapper[4785]: I1126 15:20:23.788843 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-lf5bg"] Nov 26 15:20:23 crc kubenswrapper[4785]: I1126 15:20:23.790330 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:23 crc kubenswrapper[4785]: I1126 15:20:23.796612 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-lf5bg"] Nov 26 15:20:23 crc kubenswrapper[4785]: I1126 15:20:23.881800 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7k4\" (UniqueName: \"kubernetes.io/projected/f83f3408-4867-48ce-8161-9c07c4e887ec-kube-api-access-5r7k4\") pod \"infra-operator-index-lf5bg\" (UID: \"f83f3408-4867-48ce-8161-9c07c4e887ec\") " pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:23 crc kubenswrapper[4785]: I1126 15:20:23.983312 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7k4\" (UniqueName: \"kubernetes.io/projected/f83f3408-4867-48ce-8161-9c07c4e887ec-kube-api-access-5r7k4\") pod \"infra-operator-index-lf5bg\" (UID: \"f83f3408-4867-48ce-8161-9c07c4e887ec\") " pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.007725 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7k4\" (UniqueName: \"kubernetes.io/projected/f83f3408-4867-48ce-8161-9c07c4e887ec-kube-api-access-5r7k4\") pod \"infra-operator-index-lf5bg\" (UID: \"f83f3408-4867-48ce-8161-9c07c4e887ec\") " 
pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.100380 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-fksfv" podUID="41d2f8b5-9b84-41a6-8900-fd76783adb8c" containerName="registry-server" containerID="cri-o://024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848" gracePeriod=2 Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.121716 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.528039 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-lf5bg"] Nov 26 15:20:24 crc kubenswrapper[4785]: W1126 15:20:24.540137 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83f3408_4867_48ce_8161_9c07c4e887ec.slice/crio-d697b4589df5cd92cd8200f107f03be6fbacd5acfe53515e041e05c077b10492 WatchSource:0}: Error finding container d697b4589df5cd92cd8200f107f03be6fbacd5acfe53515e041e05c077b10492: Status 404 returned error can't find the container with id d697b4589df5cd92cd8200f107f03be6fbacd5acfe53515e041e05c077b10492 Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.552449 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.692359 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4xqh\" (UniqueName: \"kubernetes.io/projected/41d2f8b5-9b84-41a6-8900-fd76783adb8c-kube-api-access-n4xqh\") pod \"41d2f8b5-9b84-41a6-8900-fd76783adb8c\" (UID: \"41d2f8b5-9b84-41a6-8900-fd76783adb8c\") " Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.702735 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d2f8b5-9b84-41a6-8900-fd76783adb8c-kube-api-access-n4xqh" (OuterVolumeSpecName: "kube-api-access-n4xqh") pod "41d2f8b5-9b84-41a6-8900-fd76783adb8c" (UID: "41d2f8b5-9b84-41a6-8900-fd76783adb8c"). InnerVolumeSpecName "kube-api-access-n4xqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:20:24 crc kubenswrapper[4785]: I1126 15:20:24.794167 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4xqh\" (UniqueName: \"kubernetes.io/projected/41d2f8b5-9b84-41a6-8900-fd76783adb8c-kube-api-access-n4xqh\") on node \"crc\" DevicePath \"\"" Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.107073 4785 generic.go:334] "Generic (PLEG): container finished" podID="41d2f8b5-9b84-41a6-8900-fd76783adb8c" containerID="024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848" exitCode=0 Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.107144 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-fksfv" event={"ID":"41d2f8b5-9b84-41a6-8900-fd76783adb8c","Type":"ContainerDied","Data":"024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848"} Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.107169 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-fksfv" 
event={"ID":"41d2f8b5-9b84-41a6-8900-fd76783adb8c","Type":"ContainerDied","Data":"bcada80661f6b5ff2b07315500254b261ae17ba9c3973b39f2e89693970f4b7b"} Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.107173 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-fksfv" Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.107185 4785 scope.go:117] "RemoveContainer" containerID="024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848" Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.108342 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lf5bg" event={"ID":"f83f3408-4867-48ce-8161-9c07c4e887ec","Type":"ContainerStarted","Data":"d697b4589df5cd92cd8200f107f03be6fbacd5acfe53515e041e05c077b10492"} Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.130900 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-fksfv"] Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.134311 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-fksfv"] Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.134400 4785 scope.go:117] "RemoveContainer" containerID="024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848" Nov 26 15:20:25 crc kubenswrapper[4785]: E1126 15:20:25.134807 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848\": container with ID starting with 024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848 not found: ID does not exist" containerID="024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848" Nov 26 15:20:25 crc kubenswrapper[4785]: I1126 15:20:25.134847 4785 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848"} err="failed to get container status \"024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848\": rpc error: code = NotFound desc = could not find container \"024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848\": container with ID starting with 024691d51cff061eb6f53dac55faf0ad78da68f61f517ab1f86322c348596848 not found: ID does not exist" Nov 26 15:20:26 crc kubenswrapper[4785]: I1126 15:20:26.120050 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lf5bg" event={"ID":"f83f3408-4867-48ce-8161-9c07c4e887ec","Type":"ContainerStarted","Data":"1de985d5f95345e8d55f8709ee49ee1dfcb3c25d1e616cbab58ad74afcdc0c56"} Nov 26 15:20:26 crc kubenswrapper[4785]: I1126 15:20:26.153603 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-lf5bg" podStartSLOduration=2.592678054 podStartE2EDuration="3.153536017s" podCreationTimestamp="2025-11-26 15:20:23 +0000 UTC" firstStartedPulling="2025-11-26 15:20:24.543975291 +0000 UTC m=+788.222341055" lastFinishedPulling="2025-11-26 15:20:25.104833254 +0000 UTC m=+788.783199018" observedRunningTime="2025-11-26 15:20:26.148108508 +0000 UTC m=+789.826474342" watchObservedRunningTime="2025-11-26 15:20:26.153536017 +0000 UTC m=+789.831901851" Nov 26 15:20:27 crc kubenswrapper[4785]: I1126 15:20:27.049666 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d2f8b5-9b84-41a6-8900-fd76783adb8c" path="/var/lib/kubelet/pods/41d2f8b5-9b84-41a6-8900-fd76783adb8c/volumes" Nov 26 15:20:34 crc kubenswrapper[4785]: I1126 15:20:34.122426 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:34 crc kubenswrapper[4785]: I1126 15:20:34.123195 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:34 crc kubenswrapper[4785]: I1126 15:20:34.158106 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:34 crc kubenswrapper[4785]: I1126 15:20:34.212626 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-lf5bg" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.630515 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx"] Nov 26 15:20:35 crc kubenswrapper[4785]: E1126 15:20:35.630928 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d2f8b5-9b84-41a6-8900-fd76783adb8c" containerName="registry-server" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.630950 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d2f8b5-9b84-41a6-8900-fd76783adb8c" containerName="registry-server" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.631124 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d2f8b5-9b84-41a6-8900-fd76783adb8c" containerName="registry-server" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.633286 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.635762 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sfhll" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.642595 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx"] Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.737782 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh45t\" (UniqueName: \"kubernetes.io/projected/362049d6-ebac-4703-b856-408cc878f2b6-kube-api-access-qh45t\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.737830 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.737877 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 
15:20:35.839467 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh45t\" (UniqueName: \"kubernetes.io/projected/362049d6-ebac-4703-b856-408cc878f2b6-kube-api-access-qh45t\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.839600 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.839696 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.840607 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.841638 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.876649 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh45t\" (UniqueName: \"kubernetes.io/projected/362049d6-ebac-4703-b856-408cc878f2b6-kube-api-access-qh45t\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:35 crc kubenswrapper[4785]: I1126 15:20:35.988247 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" Nov 26 15:20:36 crc kubenswrapper[4785]: I1126 15:20:36.457848 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx"] Nov 26 15:20:36 crc kubenswrapper[4785]: W1126 15:20:36.475942 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod362049d6_ebac_4703_b856_408cc878f2b6.slice/crio-1e62885264539155bcf77c2239951b201d455e2a4ba83576bd0d7e0a263afda0 WatchSource:0}: Error finding container 1e62885264539155bcf77c2239951b201d455e2a4ba83576bd0d7e0a263afda0: Status 404 returned error can't find the container with id 1e62885264539155bcf77c2239951b201d455e2a4ba83576bd0d7e0a263afda0 Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.202334 4785 generic.go:334] "Generic (PLEG): container finished" podID="362049d6-ebac-4703-b856-408cc878f2b6" containerID="a3b95f0582acc4208632fb5e4440b7d652ee9e82801febac8078c1dd1bc1fefe" exitCode=0 Nov 26 
15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.202686 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" event={"ID":"362049d6-ebac-4703-b856-408cc878f2b6","Type":"ContainerDied","Data":"a3b95f0582acc4208632fb5e4440b7d652ee9e82801febac8078c1dd1bc1fefe"} Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.202717 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" event={"ID":"362049d6-ebac-4703-b856-408cc878f2b6","Type":"ContainerStarted","Data":"1e62885264539155bcf77c2239951b201d455e2a4ba83576bd0d7e0a263afda0"} Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.289694 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.289803 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.289889 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.290735 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5335547fe096a0c8db3ef049b0bdabe9e8e04a213e13c79d8696fbb738fcc9fa"} 
pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:20:37 crc kubenswrapper[4785]: I1126 15:20:37.290847 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://5335547fe096a0c8db3ef049b0bdabe9e8e04a213e13c79d8696fbb738fcc9fa" gracePeriod=600 Nov 26 15:20:38 crc kubenswrapper[4785]: I1126 15:20:38.213687 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerID="5335547fe096a0c8db3ef049b0bdabe9e8e04a213e13c79d8696fbb738fcc9fa" exitCode=0 Nov 26 15:20:38 crc kubenswrapper[4785]: I1126 15:20:38.213889 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"5335547fe096a0c8db3ef049b0bdabe9e8e04a213e13c79d8696fbb738fcc9fa"} Nov 26 15:20:38 crc kubenswrapper[4785]: I1126 15:20:38.214421 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"a6ffa546d6cd3d829cf51dc79161c420c203f2371af580e656e6ea6f7619320e"} Nov 26 15:20:38 crc kubenswrapper[4785]: I1126 15:20:38.214444 4785 scope.go:117] "RemoveContainer" containerID="535a7eec54b083cab8f1c3d8c54d6dd26d51d22ccda14e30a0448b01a72f1e1c" Nov 26 15:20:38 crc kubenswrapper[4785]: I1126 15:20:38.218824 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" 
event={"ID":"362049d6-ebac-4703-b856-408cc878f2b6","Type":"ContainerStarted","Data":"5cb9b4fdf678a8f5e1d252c0406775c264b9855147050fef44c243536a2ad0cd"} Nov 26 15:20:39 crc kubenswrapper[4785]: I1126 15:20:39.230097 4785 generic.go:334] "Generic (PLEG): container finished" podID="362049d6-ebac-4703-b856-408cc878f2b6" containerID="5cb9b4fdf678a8f5e1d252c0406775c264b9855147050fef44c243536a2ad0cd" exitCode=0 Nov 26 15:20:39 crc kubenswrapper[4785]: I1126 15:20:39.230194 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" event={"ID":"362049d6-ebac-4703-b856-408cc878f2b6","Type":"ContainerDied","Data":"5cb9b4fdf678a8f5e1d252c0406775c264b9855147050fef44c243536a2ad0cd"} Nov 26 15:20:40 crc kubenswrapper[4785]: I1126 15:20:40.244257 4785 generic.go:334] "Generic (PLEG): container finished" podID="362049d6-ebac-4703-b856-408cc878f2b6" containerID="2ffe7084f15b6eeab8a5f8321de3ebc46c6126083b3f73257ab2908c2700f24a" exitCode=0 Nov 26 15:20:40 crc kubenswrapper[4785]: I1126 15:20:40.244318 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" event={"ID":"362049d6-ebac-4703-b856-408cc878f2b6","Type":"ContainerDied","Data":"2ffe7084f15b6eeab8a5f8321de3ebc46c6126083b3f73257ab2908c2700f24a"} Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.513356 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx"
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.619316 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh45t\" (UniqueName: \"kubernetes.io/projected/362049d6-ebac-4703-b856-408cc878f2b6-kube-api-access-qh45t\") pod \"362049d6-ebac-4703-b856-408cc878f2b6\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") "
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.619386 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-util\") pod \"362049d6-ebac-4703-b856-408cc878f2b6\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") "
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.619440 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-bundle\") pod \"362049d6-ebac-4703-b856-408cc878f2b6\" (UID: \"362049d6-ebac-4703-b856-408cc878f2b6\") "
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.620475 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-bundle" (OuterVolumeSpecName: "bundle") pod "362049d6-ebac-4703-b856-408cc878f2b6" (UID: "362049d6-ebac-4703-b856-408cc878f2b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.626822 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362049d6-ebac-4703-b856-408cc878f2b6-kube-api-access-qh45t" (OuterVolumeSpecName: "kube-api-access-qh45t") pod "362049d6-ebac-4703-b856-408cc878f2b6" (UID: "362049d6-ebac-4703-b856-408cc878f2b6"). InnerVolumeSpecName "kube-api-access-qh45t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.635097 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-util" (OuterVolumeSpecName: "util") pod "362049d6-ebac-4703-b856-408cc878f2b6" (UID: "362049d6-ebac-4703-b856-408cc878f2b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.720677 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh45t\" (UniqueName: \"kubernetes.io/projected/362049d6-ebac-4703-b856-408cc878f2b6-kube-api-access-qh45t\") on node \"crc\" DevicePath \"\""
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.721030 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-util\") on node \"crc\" DevicePath \"\""
Nov 26 15:20:41 crc kubenswrapper[4785]: I1126 15:20:41.721044 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/362049d6-ebac-4703-b856-408cc878f2b6-bundle\") on node \"crc\" DevicePath \"\""
Nov 26 15:20:42 crc kubenswrapper[4785]: I1126 15:20:42.258567 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx" event={"ID":"362049d6-ebac-4703-b856-408cc878f2b6","Type":"ContainerDied","Data":"1e62885264539155bcf77c2239951b201d455e2a4ba83576bd0d7e0a263afda0"}
Nov 26 15:20:42 crc kubenswrapper[4785]: I1126 15:20:42.258608 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e62885264539155bcf77c2239951b201d455e2a4ba83576bd0d7e0a263afda0"
Nov 26 15:20:42 crc kubenswrapper[4785]: I1126 15:20:42.258743 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.808460 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"]
Nov 26 15:20:46 crc kubenswrapper[4785]: E1126 15:20:46.809326 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="pull"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.809342 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="pull"
Nov 26 15:20:46 crc kubenswrapper[4785]: E1126 15:20:46.809359 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="extract"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.809370 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="extract"
Nov 26 15:20:46 crc kubenswrapper[4785]: E1126 15:20:46.809387 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="util"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.809395 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="util"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.809530 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="362049d6-ebac-4703-b856-408cc878f2b6" containerName="extract"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.810269 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.813651 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.814856 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-6wjlm"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.892007 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-apiservice-cert\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.892076 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbvxk\" (UniqueName: \"kubernetes.io/projected/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-kube-api-access-rbvxk\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.892135 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-webhook-cert\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.901679 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"]
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.993416 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-apiservice-cert\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.993482 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbvxk\" (UniqueName: \"kubernetes.io/projected/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-kube-api-access-rbvxk\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:46 crc kubenswrapper[4785]: I1126 15:20:46.993518 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-webhook-cert\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:47 crc kubenswrapper[4785]: I1126 15:20:47.003237 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-apiservice-cert\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:47 crc kubenswrapper[4785]: I1126 15:20:47.003237 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-webhook-cert\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:47 crc kubenswrapper[4785]: I1126 15:20:47.018824 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbvxk\" (UniqueName: \"kubernetes.io/projected/b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3-kube-api-access-rbvxk\") pod \"infra-operator-controller-manager-f68bdc44b-4p65x\" (UID: \"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3\") " pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:47 crc kubenswrapper[4785]: I1126 15:20:47.129300 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:47 crc kubenswrapper[4785]: I1126 15:20:47.588634 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"]
Nov 26 15:20:47 crc kubenswrapper[4785]: W1126 15:20:47.592419 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37e9a18_1d3b_4a5b_aa23_d9cc2f6394e3.slice/crio-3e21e6fbe3cb6bcc0ee60ba0df358a7c068cc4e3e67e680a92a63e8e761f0361 WatchSource:0}: Error finding container 3e21e6fbe3cb6bcc0ee60ba0df358a7c068cc4e3e67e680a92a63e8e761f0361: Status 404 returned error can't find the container with id 3e21e6fbe3cb6bcc0ee60ba0df358a7c068cc4e3e67e680a92a63e8e761f0361
Nov 26 15:20:48 crc kubenswrapper[4785]: I1126 15:20:48.296307 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerStarted","Data":"3e21e6fbe3cb6bcc0ee60ba0df358a7c068cc4e3e67e680a92a63e8e761f0361"}
Nov 26 15:20:50 crc kubenswrapper[4785]: I1126 15:20:50.309326 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerStarted","Data":"cd10c2f7bdcff41f08931840f5f399b0f572b7eed28573f56f2a292c3fb744c1"}
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.082727 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.083789 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.085778 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-xjh5b"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.086176 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.086594 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.086619 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.086924 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.093185 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.095453 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.097382 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.098410 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.119300 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.128115 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.150856 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sd2w\" (UniqueName: \"kubernetes.io/projected/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-kube-api-access-4sd2w\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.150957 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-operator-scripts\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151010 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-config-data-default\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151055 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-kolla-config\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151120 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbbz\" (UniqueName: \"kubernetes.io/projected/a9248619-c310-43ae-b33a-b51f3e9d0a03-kube-api-access-mcbbz\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151184 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-kolla-config\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151243 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-operator-scripts\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151300 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-config-data-generated\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151356 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a9248619-c310-43ae-b33a-b51f3e9d0a03-config-data-generated\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151420 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151479 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-config-data-default\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.151589 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.179186 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.252977 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcbbz\" (UniqueName: \"kubernetes.io/projected/a9248619-c310-43ae-b33a-b51f3e9d0a03-kube-api-access-mcbbz\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253027 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-kolla-config\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253051 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-operator-scripts\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253072 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-config-data-generated\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253088 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a9248619-c310-43ae-b33a-b51f3e9d0a03-config-data-generated\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253112 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-kolla-config\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253131 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253153 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-config-data-default\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253174 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vww\" (UniqueName: \"kubernetes.io/projected/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-kube-api-access-m7vww\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253192 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-config-data-default\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253216 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253234 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253334 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253377 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253472 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sd2w\" (UniqueName: \"kubernetes.io/projected/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-kube-api-access-4sd2w\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253501 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a9248619-c310-43ae-b33a-b51f3e9d0a03-config-data-generated\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253514 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253515 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-operator-scripts\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.253519 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.255779 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-config-data-default\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.256337 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-operator-scripts\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.256668 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-operator-scripts\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.257491 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-config-data-generated\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.259653 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-config-data-default\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.259712 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-kolla-config\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.260266 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-kolla-config\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.261477 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a9248619-c310-43ae-b33a-b51f3e9d0a03-config-data-default\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.274094 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-kolla-config\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.284542 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcbbz\" (UniqueName: \"kubernetes.io/projected/a9248619-c310-43ae-b33a-b51f3e9d0a03-kube-api-access-mcbbz\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.291864 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-1\" (UID: \"a9248619-c310-43ae-b33a-b51f3e9d0a03\") " pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.298300 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.309990 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sd2w\" (UniqueName: \"kubernetes.io/projected/b8f3a4fb-df39-4059-a9dd-4f566b1e4860-kube-api-access-4sd2w\") pod \"openstack-galera-2\" (UID: \"b8f3a4fb-df39-4059-a9dd-4f566b1e4860\") " pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.361461 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.361579 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.361853 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.362018 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.365039 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-kolla-config\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.363952 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-kolla-config\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.366036 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-config-data-default\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.366132 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-config-data-default\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.366229 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vww\" (UniqueName: \"kubernetes.io/projected/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-kube-api-access-m7vww\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.366335 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.367541 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.383537 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vww\" (UniqueName: \"kubernetes.io/projected/b12a01ef-3cf0-4e03-b38b-9b306ce01fdf-kube-api-access-m7vww\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.383833 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf\") " pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.454802 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.474958 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.482393 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Nov 26 15:20:51 crc kubenswrapper[4785]: I1126 15:20:51.891122 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Nov 26 15:20:52 crc kubenswrapper[4785]: W1126 15:20:52.032972 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12a01ef_3cf0_4e03_b38b_9b306ce01fdf.slice/crio-11ab23fc6e1f040f03b167b6ce57ef9fff39182e83cd4947e2acf5709ec580d1 WatchSource:0}: Error finding container 11ab23fc6e1f040f03b167b6ce57ef9fff39182e83cd4947e2acf5709ec580d1: Status 404 returned error can't find the container with id 11ab23fc6e1f040f03b167b6ce57ef9fff39182e83cd4947e2acf5709ec580d1
Nov 26 15:20:52 crc kubenswrapper[4785]: I1126 15:20:52.333978 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf","Type":"ContainerStarted","Data":"11ab23fc6e1f040f03b167b6ce57ef9fff39182e83cd4947e2acf5709ec580d1"}
Nov 26 15:20:52 crc kubenswrapper[4785]: I1126 15:20:52.773168 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Nov 26 15:20:53 crc kubenswrapper[4785]: W1126 15:20:52.998738 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f3a4fb_df39_4059_a9dd_4f566b1e4860.slice/crio-ace197738dde21f5363f50faaddc0963d52854c7b47e09b1bcf00f217af0a96f WatchSource:0}: Error finding container ace197738dde21f5363f50faaddc0963d52854c7b47e09b1bcf00f217af0a96f: Status 404 returned error can't find the container with id ace197738dde21f5363f50faaddc0963d52854c7b47e09b1bcf00f217af0a96f
Nov 26 15:20:53 crc kubenswrapper[4785]: I1126 15:20:53.349679 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerStarted","Data":"8c7daa00b0486982cdac3e81a1031828108e75e5fc9ce268b91a45f22857c06b"}
Nov 26 15:20:53 crc kubenswrapper[4785]: I1126 15:20:53.350039 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:20:53 crc kubenswrapper[4785]: I1126 15:20:53.350919 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"b8f3a4fb-df39-4059-a9dd-4f566b1e4860","Type":"ContainerStarted","Data":"ace197738dde21f5363f50faaddc0963d52854c7b47e09b1bcf00f217af0a96f"}
Nov 26 15:20:53 crc kubenswrapper[4785]: I1126 15:20:53.371165 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podStartSLOduration=1.929461662 podStartE2EDuration="7.371149143s" podCreationTimestamp="2025-11-26 15:20:46 +0000 UTC" firstStartedPulling="2025-11-26 15:20:47.596132582 +0000 UTC m=+811.274498346" lastFinishedPulling="2025-11-26 15:20:53.037820043 +0000 UTC m=+816.716185827" observedRunningTime="2025-11-26 15:20:53.367943356 +0000 UTC m=+817.046309130" watchObservedRunningTime="2025-11-26 15:20:53.371149143 +0000 UTC m=+817.049514907"
Nov 26 15:20:53 crc kubenswrapper[4785]: I1126 15:20:53.383398 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Nov 26 15:20:54 crc kubenswrapper[4785]: I1126 15:20:54.358119 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"a9248619-c310-43ae-b33a-b51f3e9d0a03","Type":"ContainerStarted","Data":"e626e169265aba93df6d354863365750cd4b0f8fc166fad8529987555d8921e4"}
Nov 26 15:20:57 crc kubenswrapper[4785]: I1126 15:20:57.134278 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.396520 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf","Type":"ContainerStarted","Data":"71107d285a39b58f8846694487e586f9c08be0e51530bdac438f704c10c8281b"} Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.398176 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"b8f3a4fb-df39-4059-a9dd-4f566b1e4860","Type":"ContainerStarted","Data":"32ff011d3b3a405323889fdde6d7a4092343716b220b4b702509f709663c7dd3"} Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.400104 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"a9248619-c310-43ae-b33a-b51f3e9d0a03","Type":"ContainerStarted","Data":"6811b67b5412db2325a761152556a1ac9510242d65ff9425452a7a26af93a392"} Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.590416 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5txwc"] Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.591649 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.619965 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5txwc"] Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.723229 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-catalog-content\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.723392 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-utilities\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.723423 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk74h\" (UniqueName: \"kubernetes.io/projected/15d94918-e52c-429d-a366-cbe3ac99ba3b-kube-api-access-bk74h\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.824270 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk74h\" (UniqueName: \"kubernetes.io/projected/15d94918-e52c-429d-a366-cbe3ac99ba3b-kube-api-access-bk74h\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.824345 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-catalog-content\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.824409 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-utilities\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.824834 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-utilities\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.825478 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-catalog-content\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.845473 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk74h\" (UniqueName: \"kubernetes.io/projected/15d94918-e52c-429d-a366-cbe3ac99ba3b-kube-api-access-bk74h\") pod \"redhat-marketplace-5txwc\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:00 crc kubenswrapper[4785]: I1126 15:21:00.904577 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:01 crc kubenswrapper[4785]: I1126 15:21:01.323467 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5txwc"] Nov 26 15:21:01 crc kubenswrapper[4785]: W1126 15:21:01.332832 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d94918_e52c_429d_a366_cbe3ac99ba3b.slice/crio-468a9d0fb92b2001484d82d7beaa2e337bf3bb795718dc6f481fe4b311b0fbd2 WatchSource:0}: Error finding container 468a9d0fb92b2001484d82d7beaa2e337bf3bb795718dc6f481fe4b311b0fbd2: Status 404 returned error can't find the container with id 468a9d0fb92b2001484d82d7beaa2e337bf3bb795718dc6f481fe4b311b0fbd2 Nov 26 15:21:01 crc kubenswrapper[4785]: I1126 15:21:01.405860 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5txwc" event={"ID":"15d94918-e52c-429d-a366-cbe3ac99ba3b","Type":"ContainerStarted","Data":"468a9d0fb92b2001484d82d7beaa2e337bf3bb795718dc6f481fe4b311b0fbd2"} Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.412229 4785 generic.go:334] "Generic (PLEG): container finished" podID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerID="fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff" exitCode=0 Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.412338 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5txwc" event={"ID":"15d94918-e52c-429d-a366-cbe3ac99ba3b","Type":"ContainerDied","Data":"fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff"} Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.701389 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.702331 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.703507 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.703852 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-qnhwd" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.716652 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.855086 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1feca48-dc9b-434e-8caf-608727943291-config-data\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.855155 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1feca48-dc9b-434e-8caf-608727943291-kolla-config\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.855216 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcvj\" (UniqueName: \"kubernetes.io/projected/a1feca48-dc9b-434e-8caf-608727943291-kube-api-access-4wcvj\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.956103 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1feca48-dc9b-434e-8caf-608727943291-config-data\") pod 
\"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.956165 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1feca48-dc9b-434e-8caf-608727943291-kolla-config\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.956193 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcvj\" (UniqueName: \"kubernetes.io/projected/a1feca48-dc9b-434e-8caf-608727943291-kube-api-access-4wcvj\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.957209 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1feca48-dc9b-434e-8caf-608727943291-config-data\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.957291 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1feca48-dc9b-434e-8caf-608727943291-kolla-config\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:02 crc kubenswrapper[4785]: I1126 15:21:02.988226 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcvj\" (UniqueName: \"kubernetes.io/projected/a1feca48-dc9b-434e-8caf-608727943291-kube-api-access-4wcvj\") pod \"memcached-0\" (UID: \"a1feca48-dc9b-434e-8caf-608727943291\") " pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:03 crc kubenswrapper[4785]: I1126 15:21:03.053909 4785 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:03 crc kubenswrapper[4785]: I1126 15:21:03.497685 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 26 15:21:03 crc kubenswrapper[4785]: W1126 15:21:03.502648 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1feca48_dc9b_434e_8caf_608727943291.slice/crio-8767afb289096426106e2fbe5361462a22d011118fc8043f59a7e74055663d71 WatchSource:0}: Error finding container 8767afb289096426106e2fbe5361462a22d011118fc8043f59a7e74055663d71: Status 404 returned error can't find the container with id 8767afb289096426106e2fbe5361462a22d011118fc8043f59a7e74055663d71 Nov 26 15:21:04 crc kubenswrapper[4785]: I1126 15:21:04.443030 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"a1feca48-dc9b-434e-8caf-608727943291","Type":"ContainerStarted","Data":"8767afb289096426106e2fbe5361462a22d011118fc8043f59a7e74055663d71"} Nov 26 15:21:04 crc kubenswrapper[4785]: I1126 15:21:04.444978 4785 generic.go:334] "Generic (PLEG): container finished" podID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerID="85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5" exitCode=0 Nov 26 15:21:04 crc kubenswrapper[4785]: I1126 15:21:04.445034 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5txwc" event={"ID":"15d94918-e52c-429d-a366-cbe3ac99ba3b","Type":"ContainerDied","Data":"85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5"} Nov 26 15:21:05 crc kubenswrapper[4785]: I1126 15:21:05.453264 4785 generic.go:334] "Generic (PLEG): container finished" podID="b8f3a4fb-df39-4059-a9dd-4f566b1e4860" containerID="32ff011d3b3a405323889fdde6d7a4092343716b220b4b702509f709663c7dd3" exitCode=0 Nov 26 15:21:05 crc kubenswrapper[4785]: I1126 15:21:05.453360 4785 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"b8f3a4fb-df39-4059-a9dd-4f566b1e4860","Type":"ContainerDied","Data":"32ff011d3b3a405323889fdde6d7a4092343716b220b4b702509f709663c7dd3"} Nov 26 15:21:05 crc kubenswrapper[4785]: I1126 15:21:05.456435 4785 generic.go:334] "Generic (PLEG): container finished" podID="a9248619-c310-43ae-b33a-b51f3e9d0a03" containerID="6811b67b5412db2325a761152556a1ac9510242d65ff9425452a7a26af93a392" exitCode=0 Nov 26 15:21:05 crc kubenswrapper[4785]: I1126 15:21:05.456467 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"a9248619-c310-43ae-b33a-b51f3e9d0a03","Type":"ContainerDied","Data":"6811b67b5412db2325a761152556a1ac9510242d65ff9425452a7a26af93a392"} Nov 26 15:21:05 crc kubenswrapper[4785]: I1126 15:21:05.457859 4785 generic.go:334] "Generic (PLEG): container finished" podID="b12a01ef-3cf0-4e03-b38b-9b306ce01fdf" containerID="71107d285a39b58f8846694487e586f9c08be0e51530bdac438f704c10c8281b" exitCode=0 Nov 26 15:21:05 crc kubenswrapper[4785]: I1126 15:21:05.457883 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf","Type":"ContainerDied","Data":"71107d285a39b58f8846694487e586f9c08be0e51530bdac438f704c10c8281b"} Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.473351 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"a9248619-c310-43ae-b33a-b51f3e9d0a03","Type":"ContainerStarted","Data":"49cceb8adb3975b75b0478a1dc17a4b2673e80608bd3587612774c093f6e0b9e"} Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.476054 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" 
event={"ID":"b12a01ef-3cf0-4e03-b38b-9b306ce01fdf","Type":"ContainerStarted","Data":"cd3bfd715314567e3cb6e02afe7e422e280bdae3d714dc9361b473b353e6e3ef"} Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.480871 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"a1feca48-dc9b-434e-8caf-608727943291","Type":"ContainerStarted","Data":"9a4105b8e5b86f91a9aec5d8f593c58c0dac832fb9a814e8f784b0d102269581"} Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.481147 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.484456 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5txwc" event={"ID":"15d94918-e52c-429d-a366-cbe3ac99ba3b","Type":"ContainerStarted","Data":"e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d"} Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.486075 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"b8f3a4fb-df39-4059-a9dd-4f566b1e4860","Type":"ContainerStarted","Data":"7fce38fa44944c507931572bc187b781ffeeb0bf6d4850fb4df47e39b7a6c2f2"} Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.504249 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=9.769193505 podStartE2EDuration="16.504232888s" podCreationTimestamp="2025-11-26 15:20:50 +0000 UTC" firstStartedPulling="2025-11-26 15:20:53.412293597 +0000 UTC m=+817.090659361" lastFinishedPulling="2025-11-26 15:21:00.14733297 +0000 UTC m=+823.825698744" observedRunningTime="2025-11-26 15:21:06.501069492 +0000 UTC m=+830.179435256" watchObservedRunningTime="2025-11-26 15:21:06.504232888 +0000 UTC m=+830.182598652" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.522426 4785 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=8.383162663 podStartE2EDuration="16.522408205s" podCreationTimestamp="2025-11-26 15:20:50 +0000 UTC" firstStartedPulling="2025-11-26 15:20:52.034683964 +0000 UTC m=+815.713049728" lastFinishedPulling="2025-11-26 15:21:00.173929506 +0000 UTC m=+823.852295270" observedRunningTime="2025-11-26 15:21:06.51858439 +0000 UTC m=+830.196950164" watchObservedRunningTime="2025-11-26 15:21:06.522408205 +0000 UTC m=+830.200773969" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.541475 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=9.410865412 podStartE2EDuration="16.541452195s" podCreationTimestamp="2025-11-26 15:20:50 +0000 UTC" firstStartedPulling="2025-11-26 15:20:53.001029518 +0000 UTC m=+816.679395292" lastFinishedPulling="2025-11-26 15:21:00.131616291 +0000 UTC m=+823.809982075" observedRunningTime="2025-11-26 15:21:06.538192236 +0000 UTC m=+830.216558000" watchObservedRunningTime="2025-11-26 15:21:06.541452195 +0000 UTC m=+830.219817959" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.558816 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=2.64075515 podStartE2EDuration="4.558794878s" podCreationTimestamp="2025-11-26 15:21:02 +0000 UTC" firstStartedPulling="2025-11-26 15:21:03.505143376 +0000 UTC m=+827.183509140" lastFinishedPulling="2025-11-26 15:21:05.423183104 +0000 UTC m=+829.101548868" observedRunningTime="2025-11-26 15:21:06.553900654 +0000 UTC m=+830.232266448" watchObservedRunningTime="2025-11-26 15:21:06.558794878 +0000 UTC m=+830.237160652" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.577718 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5txwc" podStartSLOduration=3.572614947 podStartE2EDuration="6.577697764s" 
podCreationTimestamp="2025-11-26 15:21:00 +0000 UTC" firstStartedPulling="2025-11-26 15:21:02.414042366 +0000 UTC m=+826.092408130" lastFinishedPulling="2025-11-26 15:21:05.419125183 +0000 UTC m=+829.097490947" observedRunningTime="2025-11-26 15:21:06.575376451 +0000 UTC m=+830.253742225" watchObservedRunningTime="2025-11-26 15:21:06.577697764 +0000 UTC m=+830.256063538" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.584949 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-znj27"] Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.586047 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.588363 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-m98qt" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.590576 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-znj27"] Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.707818 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbrq4\" (UniqueName: \"kubernetes.io/projected/50491112-0d7a-44d0-b66f-9920357a2eff-kube-api-access-dbrq4\") pod \"rabbitmq-cluster-operator-index-znj27\" (UID: \"50491112-0d7a-44d0-b66f-9920357a2eff\") " pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.810177 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbrq4\" (UniqueName: \"kubernetes.io/projected/50491112-0d7a-44d0-b66f-9920357a2eff-kube-api-access-dbrq4\") pod \"rabbitmq-cluster-operator-index-znj27\" (UID: \"50491112-0d7a-44d0-b66f-9920357a2eff\") " 
pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.829002 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbrq4\" (UniqueName: \"kubernetes.io/projected/50491112-0d7a-44d0-b66f-9920357a2eff-kube-api-access-dbrq4\") pod \"rabbitmq-cluster-operator-index-znj27\" (UID: \"50491112-0d7a-44d0-b66f-9920357a2eff\") " pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:06 crc kubenswrapper[4785]: I1126 15:21:06.904621 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:07 crc kubenswrapper[4785]: I1126 15:21:07.340986 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-znj27"] Nov 26 15:21:07 crc kubenswrapper[4785]: W1126 15:21:07.348723 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50491112_0d7a_44d0_b66f_9920357a2eff.slice/crio-02d77c0a0aabb747218e9c88e238c26038c675cd002e6e37496542c4e336c30b WatchSource:0}: Error finding container 02d77c0a0aabb747218e9c88e238c26038c675cd002e6e37496542c4e336c30b: Status 404 returned error can't find the container with id 02d77c0a0aabb747218e9c88e238c26038c675cd002e6e37496542c4e336c30b Nov 26 15:21:07 crc kubenswrapper[4785]: I1126 15:21:07.493889 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" event={"ID":"50491112-0d7a-44d0-b66f-9920357a2eff","Type":"ContainerStarted","Data":"02d77c0a0aabb747218e9c88e238c26038c675cd002e6e37496542c4e336c30b"} Nov 26 15:21:08 crc kubenswrapper[4785]: E1126 15:21:08.352285 4785 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.44:45966->38.102.83.44:36007: read tcp 38.102.83.44:45966->38.102.83.44:36007: read: connection 
reset by peer Nov 26 15:21:10 crc kubenswrapper[4785]: I1126 15:21:10.905287 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:10 crc kubenswrapper[4785]: I1126 15:21:10.906212 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:10 crc kubenswrapper[4785]: I1126 15:21:10.958041 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.455918 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.455966 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.475978 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.476029 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.483733 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.483850 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.529007 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" event={"ID":"50491112-0d7a-44d0-b66f-9920357a2eff","Type":"ContainerStarted","Data":"df11807e5648c9fbce706f2fbd0b70d3d44ea8566d7f3fc3c6845db41f3163b2"} Nov 26 
15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.587021 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:11 crc kubenswrapper[4785]: I1126 15:21:11.616583 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" podStartSLOduration=2.404787189 podStartE2EDuration="5.616565308s" podCreationTimestamp="2025-11-26 15:21:06 +0000 UTC" firstStartedPulling="2025-11-26 15:21:07.35035743 +0000 UTC m=+831.028723214" lastFinishedPulling="2025-11-26 15:21:10.562135569 +0000 UTC m=+834.240501333" observedRunningTime="2025-11-26 15:21:11.553960159 +0000 UTC m=+835.232325933" watchObservedRunningTime="2025-11-26 15:21:11.616565308 +0000 UTC m=+835.294931082" Nov 26 15:21:13 crc kubenswrapper[4785]: I1126 15:21:13.056735 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Nov 26 15:21:15 crc kubenswrapper[4785]: I1126 15:21:15.573773 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 15:21:15 crc kubenswrapper[4785]: I1126 15:21:15.670429 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Nov 26 15:21:15 crc kubenswrapper[4785]: E1126 15:21:15.875180 4785 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.44:39886->38.102.83.44:36007: write tcp 38.102.83.44:39886->38.102.83.44:36007: write: broken pipe Nov 26 15:21:15 crc kubenswrapper[4785]: E1126 15:21:15.915955 4785 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.44:39894->38.102.83.44:36007: read tcp 38.102.83.44:39894->38.102.83.44:36007: read: connection reset by peer Nov 26 15:21:15 crc kubenswrapper[4785]: I1126 15:21:15.975255 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5txwc"] Nov 26 15:21:15 crc kubenswrapper[4785]: I1126 15:21:15.975675 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5txwc" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="registry-server" containerID="cri-o://e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d" gracePeriod=2 Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.463170 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.544533 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk74h\" (UniqueName: \"kubernetes.io/projected/15d94918-e52c-429d-a366-cbe3ac99ba3b-kube-api-access-bk74h\") pod \"15d94918-e52c-429d-a366-cbe3ac99ba3b\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.544647 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-utilities\") pod \"15d94918-e52c-429d-a366-cbe3ac99ba3b\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.544724 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-catalog-content\") pod \"15d94918-e52c-429d-a366-cbe3ac99ba3b\" (UID: \"15d94918-e52c-429d-a366-cbe3ac99ba3b\") " Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.545817 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-utilities" (OuterVolumeSpecName: "utilities") pod "15d94918-e52c-429d-a366-cbe3ac99ba3b" (UID: 
"15d94918-e52c-429d-a366-cbe3ac99ba3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.553410 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d94918-e52c-429d-a366-cbe3ac99ba3b-kube-api-access-bk74h" (OuterVolumeSpecName: "kube-api-access-bk74h") pod "15d94918-e52c-429d-a366-cbe3ac99ba3b" (UID: "15d94918-e52c-429d-a366-cbe3ac99ba3b"). InnerVolumeSpecName "kube-api-access-bk74h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.566988 4785 generic.go:334] "Generic (PLEG): container finished" podID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerID="e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d" exitCode=0 Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.567032 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5txwc" event={"ID":"15d94918-e52c-429d-a366-cbe3ac99ba3b","Type":"ContainerDied","Data":"e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d"} Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.567064 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5txwc" event={"ID":"15d94918-e52c-429d-a366-cbe3ac99ba3b","Type":"ContainerDied","Data":"468a9d0fb92b2001484d82d7beaa2e337bf3bb795718dc6f481fe4b311b0fbd2"} Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.567089 4785 scope.go:117] "RemoveContainer" containerID="e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.567036 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5txwc" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.587310 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15d94918-e52c-429d-a366-cbe3ac99ba3b" (UID: "15d94918-e52c-429d-a366-cbe3ac99ba3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.601247 4785 scope.go:117] "RemoveContainer" containerID="85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.618412 4785 scope.go:117] "RemoveContainer" containerID="fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.645954 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.645989 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d94918-e52c-429d-a366-cbe3ac99ba3b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.646004 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk74h\" (UniqueName: \"kubernetes.io/projected/15d94918-e52c-429d-a366-cbe3ac99ba3b-kube-api-access-bk74h\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.658090 4785 scope.go:117] "RemoveContainer" containerID="e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d" Nov 26 15:21:16 crc kubenswrapper[4785]: E1126 15:21:16.658519 4785 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d\": container with ID starting with e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d not found: ID does not exist" containerID="e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.658566 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d"} err="failed to get container status \"e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d\": rpc error: code = NotFound desc = could not find container \"e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d\": container with ID starting with e72d61d2bd7b776255f6e158bbd49ae6bae9a467fc08bea8caf8c000bf49967d not found: ID does not exist" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.658593 4785 scope.go:117] "RemoveContainer" containerID="85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5" Nov 26 15:21:16 crc kubenswrapper[4785]: E1126 15:21:16.659077 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5\": container with ID starting with 85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5 not found: ID does not exist" containerID="85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.659104 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5"} err="failed to get container status \"85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5\": rpc error: code = NotFound desc = could not find container 
\"85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5\": container with ID starting with 85e2635820e0e9cf77c30329b8c0b0fe27fa8891fd60e8191c6f2eb7712597d5 not found: ID does not exist" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.659123 4785 scope.go:117] "RemoveContainer" containerID="fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff" Nov 26 15:21:16 crc kubenswrapper[4785]: E1126 15:21:16.659403 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff\": container with ID starting with fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff not found: ID does not exist" containerID="fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.659448 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff"} err="failed to get container status \"fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff\": rpc error: code = NotFound desc = could not find container \"fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff\": container with ID starting with fceab1068f597d560db1d4b854a3508200d1bc40a8f8ab195acdecbf15f97bff not found: ID does not exist" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.905044 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.905290 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.924928 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5txwc"] Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.933623 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:16 crc kubenswrapper[4785]: I1126 15:21:16.950026 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5txwc"] Nov 26 15:21:17 crc kubenswrapper[4785]: I1126 15:21:17.048497 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" path="/var/lib/kubelet/pods/15d94918-e52c-429d-a366-cbe3ac99ba3b/volumes" Nov 26 15:21:17 crc kubenswrapper[4785]: I1126 15:21:17.619463 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-znj27" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.257928 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r"] Nov 26 15:21:19 crc kubenswrapper[4785]: E1126 15:21:19.259335 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="registry-server" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.259415 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="registry-server" Nov 26 15:21:19 crc kubenswrapper[4785]: E1126 15:21:19.259484 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="extract-utilities" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.259544 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="extract-utilities" Nov 26 15:21:19 crc kubenswrapper[4785]: E1126 15:21:19.259651 4785 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="extract-content" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.259735 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="extract-content" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.260033 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d94918-e52c-429d-a366-cbe3ac99ba3b" containerName="registry-server" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.261750 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.268042 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sfhll" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.281595 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r"] Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.284843 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rlk\" (UniqueName: \"kubernetes.io/projected/c784173a-04b2-490a-b83a-ce589d9b5459-kube-api-access-22rlk\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.284900 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.284918 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.385630 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rlk\" (UniqueName: \"kubernetes.io/projected/c784173a-04b2-490a-b83a-ce589d9b5459-kube-api-access-22rlk\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.385682 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.385700 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 
15:21:19.386105 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.386517 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.404191 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rlk\" (UniqueName: \"kubernetes.io/projected/c784173a-04b2-490a-b83a-ce589d9b5459-kube-api-access-22rlk\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:19 crc kubenswrapper[4785]: I1126 15:21:19.583469 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.581517 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8fq5w"] Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.583425 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.595593 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8fq5w"] Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.701375 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlp7m\" (UniqueName: \"kubernetes.io/projected/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-kube-api-access-nlp7m\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.701473 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-utilities\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.701503 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-catalog-content\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.802785 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-catalog-content\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.802907 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nlp7m\" (UniqueName: \"kubernetes.io/projected/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-kube-api-access-nlp7m\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.802966 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-utilities\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.803292 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-catalog-content\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.803387 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-utilities\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.827505 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlp7m\" (UniqueName: \"kubernetes.io/projected/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-kube-api-access-nlp7m\") pod \"certified-operators-8fq5w\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:20 crc kubenswrapper[4785]: I1126 15:21:20.918075 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.340269 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r"] Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.462087 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8fq5w"] Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.573603 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="b8f3a4fb-df39-4059-a9dd-4f566b1e4860" containerName="galera" probeResult="failure" output=< Nov 26 15:21:21 crc kubenswrapper[4785]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Nov 26 15:21:21 crc kubenswrapper[4785]: > Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.607939 4785 generic.go:334] "Generic (PLEG): container finished" podID="c784173a-04b2-490a-b83a-ce589d9b5459" containerID="7407c367c0a084f670a71be0e7d9433c60f61689dc570c5c84fe91b261e2003f" exitCode=0 Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.608013 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" event={"ID":"c784173a-04b2-490a-b83a-ce589d9b5459","Type":"ContainerDied","Data":"7407c367c0a084f670a71be0e7d9433c60f61689dc570c5c84fe91b261e2003f"} Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.608042 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" event={"ID":"c784173a-04b2-490a-b83a-ce589d9b5459","Type":"ContainerStarted","Data":"1dd3bd0bfa37a0473c3315128823a74be1fbf35d06c50c803ce2e6e84461ac8c"} Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.609510 4785 generic.go:334] "Generic (PLEG): container finished" 
podID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerID="c00e06cc7a05a2d571507e4b49a027693250ade30f90f09643bc4218fd49dda6" exitCode=0 Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.609540 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fq5w" event={"ID":"fb85f3c0-cee4-43c6-a07b-058c3e0cd451","Type":"ContainerDied","Data":"c00e06cc7a05a2d571507e4b49a027693250ade30f90f09643bc4218fd49dda6"} Nov 26 15:21:21 crc kubenswrapper[4785]: I1126 15:21:21.609577 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fq5w" event={"ID":"fb85f3c0-cee4-43c6-a07b-058c3e0cd451","Type":"ContainerStarted","Data":"6f478ca87dfe188b1da2806f45f39c76627dd884083f980ed8854770eb7804f1"} Nov 26 15:21:22 crc kubenswrapper[4785]: I1126 15:21:22.618793 4785 generic.go:334] "Generic (PLEG): container finished" podID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerID="93066a7bba53f02c6e29a69041fb41177aea6aab57c239a20bc375bb1e2737e2" exitCode=0 Nov 26 15:21:22 crc kubenswrapper[4785]: I1126 15:21:22.618936 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fq5w" event={"ID":"fb85f3c0-cee4-43c6-a07b-058c3e0cd451","Type":"ContainerDied","Data":"93066a7bba53f02c6e29a69041fb41177aea6aab57c239a20bc375bb1e2737e2"} Nov 26 15:21:23 crc kubenswrapper[4785]: I1126 15:21:23.626619 4785 generic.go:334] "Generic (PLEG): container finished" podID="c784173a-04b2-490a-b83a-ce589d9b5459" containerID="7790ded16cbe1d356540d2f54b0c57a3643a82dcc27fdbf2f8eeed33f6800f06" exitCode=0 Nov 26 15:21:23 crc kubenswrapper[4785]: I1126 15:21:23.626929 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" event={"ID":"c784173a-04b2-490a-b83a-ce589d9b5459","Type":"ContainerDied","Data":"7790ded16cbe1d356540d2f54b0c57a3643a82dcc27fdbf2f8eeed33f6800f06"} Nov 26 15:21:23 crc 
kubenswrapper[4785]: I1126 15:21:23.630273 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fq5w" event={"ID":"fb85f3c0-cee4-43c6-a07b-058c3e0cd451","Type":"ContainerStarted","Data":"5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f"} Nov 26 15:21:23 crc kubenswrapper[4785]: I1126 15:21:23.683228 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8fq5w" podStartSLOduration=2.142448155 podStartE2EDuration="3.683163363s" podCreationTimestamp="2025-11-26 15:21:20 +0000 UTC" firstStartedPulling="2025-11-26 15:21:21.610727193 +0000 UTC m=+845.289092957" lastFinishedPulling="2025-11-26 15:21:23.151442401 +0000 UTC m=+846.829808165" observedRunningTime="2025-11-26 15:21:23.675887837 +0000 UTC m=+847.354253641" watchObservedRunningTime="2025-11-26 15:21:23.683163363 +0000 UTC m=+847.361529177" Nov 26 15:21:24 crc kubenswrapper[4785]: I1126 15:21:24.639176 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" event={"ID":"c784173a-04b2-490a-b83a-ce589d9b5459","Type":"ContainerStarted","Data":"22c6c5aa7a65d1dc88cc7b94faf89563407a34be6d881eda462322b186a91b75"} Nov 26 15:21:24 crc kubenswrapper[4785]: I1126 15:21:24.655474 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" podStartSLOduration=4.531896897 podStartE2EDuration="5.655453517s" podCreationTimestamp="2025-11-26 15:21:19 +0000 UTC" firstStartedPulling="2025-11-26 15:21:21.609176221 +0000 UTC m=+845.287541985" lastFinishedPulling="2025-11-26 15:21:22.732732841 +0000 UTC m=+846.411098605" observedRunningTime="2025-11-26 15:21:24.654146242 +0000 UTC m=+848.332512026" watchObservedRunningTime="2025-11-26 15:21:24.655453517 +0000 UTC m=+848.333819281" Nov 26 15:21:25 crc 
kubenswrapper[4785]: I1126 15:21:25.648520 4785 generic.go:334] "Generic (PLEG): container finished" podID="c784173a-04b2-490a-b83a-ce589d9b5459" containerID="22c6c5aa7a65d1dc88cc7b94faf89563407a34be6d881eda462322b186a91b75" exitCode=0 Nov 26 15:21:25 crc kubenswrapper[4785]: I1126 15:21:25.648589 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" event={"ID":"c784173a-04b2-490a-b83a-ce589d9b5459","Type":"ContainerDied","Data":"22c6c5aa7a65d1dc88cc7b94faf89563407a34be6d881eda462322b186a91b75"} Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.016451 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.198525 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-bundle\") pod \"c784173a-04b2-490a-b83a-ce589d9b5459\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.198591 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rlk\" (UniqueName: \"kubernetes.io/projected/c784173a-04b2-490a-b83a-ce589d9b5459-kube-api-access-22rlk\") pod \"c784173a-04b2-490a-b83a-ce589d9b5459\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.198632 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-util\") pod \"c784173a-04b2-490a-b83a-ce589d9b5459\" (UID: \"c784173a-04b2-490a-b83a-ce589d9b5459\") " Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.199023 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-bundle" (OuterVolumeSpecName: "bundle") pod "c784173a-04b2-490a-b83a-ce589d9b5459" (UID: "c784173a-04b2-490a-b83a-ce589d9b5459"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.205998 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c784173a-04b2-490a-b83a-ce589d9b5459-kube-api-access-22rlk" (OuterVolumeSpecName: "kube-api-access-22rlk") pod "c784173a-04b2-490a-b83a-ce589d9b5459" (UID: "c784173a-04b2-490a-b83a-ce589d9b5459"). InnerVolumeSpecName "kube-api-access-22rlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.209112 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-util" (OuterVolumeSpecName: "util") pod "c784173a-04b2-490a-b83a-ce589d9b5459" (UID: "c784173a-04b2-490a-b83a-ce589d9b5459"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.300215 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.300514 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22rlk\" (UniqueName: \"kubernetes.io/projected/c784173a-04b2-490a-b83a-ce589d9b5459-kube-api-access-22rlk\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.300527 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c784173a-04b2-490a-b83a-ce589d9b5459-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.612439 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.673383 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" event={"ID":"c784173a-04b2-490a-b83a-ce589d9b5459","Type":"ContainerDied","Data":"1dd3bd0bfa37a0473c3315128823a74be1fbf35d06c50c803ce2e6e84461ac8c"} Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.673438 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd3bd0bfa37a0473c3315128823a74be1fbf35d06c50c803ce2e6e84461ac8c" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.673454 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r" Nov 26 15:21:27 crc kubenswrapper[4785]: I1126 15:21:27.714908 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Nov 26 15:21:27 crc kubenswrapper[4785]: E1126 15:21:27.813824 4785 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc784173a_04b2_490a_b83a_ce589d9b5459.slice\": RecentStats: unable to find data in memory cache]" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.161302 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.245526 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.589530 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lcdxx"] Nov 26 15:21:30 crc kubenswrapper[4785]: E1126 15:21:30.589777 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="pull" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.589789 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="pull" Nov 26 15:21:30 crc kubenswrapper[4785]: E1126 15:21:30.589806 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="extract" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.589812 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="extract" Nov 26 15:21:30 crc kubenswrapper[4785]: E1126 15:21:30.589830 4785 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="util" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.589835 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="util" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.589948 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="c784173a-04b2-490a-b83a-ce589d9b5459" containerName="extract" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.590749 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.612256 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcdxx"] Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.746416 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588c635-bfa9-4276-ba15-0ca8b125fc67-utilities\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.746494 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch629\" (UniqueName: \"kubernetes.io/projected/e588c635-bfa9-4276-ba15-0ca8b125fc67-kube-api-access-ch629\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.746650 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588c635-bfa9-4276-ba15-0ca8b125fc67-catalog-content\") pod 
\"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.847726 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588c635-bfa9-4276-ba15-0ca8b125fc67-utilities\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.847791 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch629\" (UniqueName: \"kubernetes.io/projected/e588c635-bfa9-4276-ba15-0ca8b125fc67-kube-api-access-ch629\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.847827 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588c635-bfa9-4276-ba15-0ca8b125fc67-catalog-content\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.848319 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588c635-bfa9-4276-ba15-0ca8b125fc67-utilities\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.848382 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588c635-bfa9-4276-ba15-0ca8b125fc67-catalog-content\") pod \"community-operators-lcdxx\" (UID: 
\"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.865460 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch629\" (UniqueName: \"kubernetes.io/projected/e588c635-bfa9-4276-ba15-0ca8b125fc67-kube-api-access-ch629\") pod \"community-operators-lcdxx\" (UID: \"e588c635-bfa9-4276-ba15-0ca8b125fc67\") " pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.906940 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.920914 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.920970 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:30 crc kubenswrapper[4785]: I1126 15:21:30.974518 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:31 crc kubenswrapper[4785]: I1126 15:21:31.337362 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcdxx"] Nov 26 15:21:31 crc kubenswrapper[4785]: W1126 15:21:31.345450 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode588c635_bfa9_4276_ba15_0ca8b125fc67.slice/crio-9bda3538e010896f68129a5da684464b486b4059024f66111619423dc2404a23 WatchSource:0}: Error finding container 9bda3538e010896f68129a5da684464b486b4059024f66111619423dc2404a23: Status 404 returned error can't find the container with id 9bda3538e010896f68129a5da684464b486b4059024f66111619423dc2404a23 Nov 26 15:21:31 crc 
kubenswrapper[4785]: I1126 15:21:31.695635 4785 generic.go:334] "Generic (PLEG): container finished" podID="e588c635-bfa9-4276-ba15-0ca8b125fc67" containerID="3b056477d76163b01d77bb7dc1c45d38b517a57297375029ac48516bf431561d" exitCode=0 Nov 26 15:21:31 crc kubenswrapper[4785]: I1126 15:21:31.695694 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcdxx" event={"ID":"e588c635-bfa9-4276-ba15-0ca8b125fc67","Type":"ContainerDied","Data":"3b056477d76163b01d77bb7dc1c45d38b517a57297375029ac48516bf431561d"} Nov 26 15:21:31 crc kubenswrapper[4785]: I1126 15:21:31.695730 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcdxx" event={"ID":"e588c635-bfa9-4276-ba15-0ca8b125fc67","Type":"ContainerStarted","Data":"9bda3538e010896f68129a5da684464b486b4059024f66111619423dc2404a23"} Nov 26 15:21:31 crc kubenswrapper[4785]: I1126 15:21:31.742129 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.580690 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6gnd4"] Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.582339 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.595136 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6gnd4"] Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.706788 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-utilities\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.706865 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-catalog-content\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.706919 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88xm\" (UniqueName: \"kubernetes.io/projected/e9f21285-8a49-46c8-996d-6f51474bbc1c-kube-api-access-v88xm\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.809278 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-catalog-content\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.809368 4785 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-v88xm\" (UniqueName: \"kubernetes.io/projected/e9f21285-8a49-46c8-996d-6f51474bbc1c-kube-api-access-v88xm\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.809435 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-utilities\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.809923 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-utilities\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.810199 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-catalog-content\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.838951 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v88xm\" (UniqueName: \"kubernetes.io/projected/e9f21285-8a49-46c8-996d-6f51474bbc1c-kube-api-access-v88xm\") pod \"redhat-operators-6gnd4\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:34 crc kubenswrapper[4785]: I1126 15:21:34.907162 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.444432 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx"] Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.446407 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.452111 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-xx9r4" Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.455786 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx"] Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.637323 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5l7\" (UniqueName: \"kubernetes.io/projected/c39759f2-3183-48fa-aaee-14b24c5337d7-kube-api-access-mc5l7\") pod \"rabbitmq-cluster-operator-779fc9694b-6pwlx\" (UID: \"c39759f2-3183-48fa-aaee-14b24c5337d7\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.732272 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcdxx" event={"ID":"e588c635-bfa9-4276-ba15-0ca8b125fc67","Type":"ContainerStarted","Data":"a08da3af6dfa612bfe145e7993363b44f20cc48c2b949b12a167b49953a729f0"} Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.739446 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5l7\" (UniqueName: \"kubernetes.io/projected/c39759f2-3183-48fa-aaee-14b24c5337d7-kube-api-access-mc5l7\") pod \"rabbitmq-cluster-operator-779fc9694b-6pwlx\" (UID: 
\"c39759f2-3183-48fa-aaee-14b24c5337d7\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.759453 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6gnd4"] Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.766273 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5l7\" (UniqueName: \"kubernetes.io/projected/c39759f2-3183-48fa-aaee-14b24c5337d7-kube-api-access-mc5l7\") pod \"rabbitmq-cluster-operator-779fc9694b-6pwlx\" (UID: \"c39759f2-3183-48fa-aaee-14b24c5337d7\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" Nov 26 15:21:36 crc kubenswrapper[4785]: I1126 15:21:36.784701 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.257858 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx"] Nov 26 15:21:37 crc kubenswrapper[4785]: W1126 15:21:37.269392 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc39759f2_3183_48fa_aaee_14b24c5337d7.slice/crio-39bd9203591aa928eaf2ddfab96535fa5d673a127c19858c0d8f950e221d5f07 WatchSource:0}: Error finding container 39bd9203591aa928eaf2ddfab96535fa5d673a127c19858c0d8f950e221d5f07: Status 404 returned error can't find the container with id 39bd9203591aa928eaf2ddfab96535fa5d673a127c19858c0d8f950e221d5f07 Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.756798 4785 generic.go:334] "Generic (PLEG): container finished" podID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerID="15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed" exitCode=0 Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.756931 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerDied","Data":"15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed"} Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.757283 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerStarted","Data":"610ffa28bc3e89ba39b5ecefc63e0f846cb828c0aad9c69b24188b4dd8c1487e"} Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.758615 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" event={"ID":"c39759f2-3183-48fa-aaee-14b24c5337d7","Type":"ContainerStarted","Data":"39bd9203591aa928eaf2ddfab96535fa5d673a127c19858c0d8f950e221d5f07"} Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.760233 4785 generic.go:334] "Generic (PLEG): container finished" podID="e588c635-bfa9-4276-ba15-0ca8b125fc67" containerID="a08da3af6dfa612bfe145e7993363b44f20cc48c2b949b12a167b49953a729f0" exitCode=0 Nov 26 15:21:37 crc kubenswrapper[4785]: I1126 15:21:37.760261 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcdxx" event={"ID":"e588c635-bfa9-4276-ba15-0ca8b125fc67","Type":"ContainerDied","Data":"a08da3af6dfa612bfe145e7993363b44f20cc48c2b949b12a167b49953a729f0"} Nov 26 15:21:39 crc kubenswrapper[4785]: I1126 15:21:39.370828 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8fq5w"] Nov 26 15:21:39 crc kubenswrapper[4785]: I1126 15:21:39.371415 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8fq5w" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="registry-server" containerID="cri-o://5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f" gracePeriod=2 Nov 26 15:21:39 
crc kubenswrapper[4785]: I1126 15:21:39.775210 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcdxx" event={"ID":"e588c635-bfa9-4276-ba15-0ca8b125fc67","Type":"ContainerStarted","Data":"bcb4a786bec2bef04aa25f9053640a8563f23db793b4f92972c93917e035c044"} Nov 26 15:21:39 crc kubenswrapper[4785]: I1126 15:21:39.784697 4785 generic.go:334] "Generic (PLEG): container finished" podID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerID="5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f" exitCode=0 Nov 26 15:21:39 crc kubenswrapper[4785]: I1126 15:21:39.784749 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fq5w" event={"ID":"fb85f3c0-cee4-43c6-a07b-058c3e0cd451","Type":"ContainerDied","Data":"5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f"} Nov 26 15:21:40 crc kubenswrapper[4785]: I1126 15:21:40.907625 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:40 crc kubenswrapper[4785]: I1126 15:21:40.908022 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:40 crc kubenswrapper[4785]: E1126 15:21:40.920400 4785 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f is running failed: container process not found" containerID="5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:21:40 crc kubenswrapper[4785]: E1126 15:21:40.921027 4785 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f is 
running failed: container process not found" containerID="5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:21:40 crc kubenswrapper[4785]: E1126 15:21:40.921348 4785 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f is running failed: container process not found" containerID="5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f" cmd=["grpc_health_probe","-addr=:50051"] Nov 26 15:21:40 crc kubenswrapper[4785]: E1126 15:21:40.921390 4785 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-8fq5w" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="registry-server" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.186821 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.211454 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lcdxx" podStartSLOduration=3.612372481 podStartE2EDuration="11.211429688s" podCreationTimestamp="2025-11-26 15:21:30 +0000 UTC" firstStartedPulling="2025-11-26 15:21:31.696879719 +0000 UTC m=+855.375245483" lastFinishedPulling="2025-11-26 15:21:39.295936926 +0000 UTC m=+862.974302690" observedRunningTime="2025-11-26 15:21:39.794030541 +0000 UTC m=+863.472396315" watchObservedRunningTime="2025-11-26 15:21:41.211429688 +0000 UTC m=+864.889795462" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.303127 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlp7m\" (UniqueName: \"kubernetes.io/projected/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-kube-api-access-nlp7m\") pod \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.303479 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-catalog-content\") pod \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.303526 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-utilities\") pod \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\" (UID: \"fb85f3c0-cee4-43c6-a07b-058c3e0cd451\") " Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.305209 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-utilities" 
(OuterVolumeSpecName: "utilities") pod "fb85f3c0-cee4-43c6-a07b-058c3e0cd451" (UID: "fb85f3c0-cee4-43c6-a07b-058c3e0cd451"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.308706 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-kube-api-access-nlp7m" (OuterVolumeSpecName: "kube-api-access-nlp7m") pod "fb85f3c0-cee4-43c6-a07b-058c3e0cd451" (UID: "fb85f3c0-cee4-43c6-a07b-058c3e0cd451"). InnerVolumeSpecName "kube-api-access-nlp7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.359260 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb85f3c0-cee4-43c6-a07b-058c3e0cd451" (UID: "fb85f3c0-cee4-43c6-a07b-058c3e0cd451"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.405186 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlp7m\" (UniqueName: \"kubernetes.io/projected/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-kube-api-access-nlp7m\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.405218 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.405227 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb85f3c0-cee4-43c6-a07b-058c3e0cd451-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.796738 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerStarted","Data":"7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736"} Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.799664 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" event={"ID":"c39759f2-3183-48fa-aaee-14b24c5337d7","Type":"ContainerStarted","Data":"a8852f789a2088c60fe9325c6132d607841f1460ce489ff9a31ce9c9aaf74710"} Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.802611 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fq5w" event={"ID":"fb85f3c0-cee4-43c6-a07b-058c3e0cd451","Type":"ContainerDied","Data":"6f478ca87dfe188b1da2806f45f39c76627dd884083f980ed8854770eb7804f1"} Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.802648 4785 scope.go:117] "RemoveContainer" 
containerID="5eb98e3e59d931e676f7d50c8ecbcea3340a6bb9c400f819925d4c6f8d29467f" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.802701 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8fq5w" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.823828 4785 scope.go:117] "RemoveContainer" containerID="93066a7bba53f02c6e29a69041fb41177aea6aab57c239a20bc375bb1e2737e2" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.833415 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" podStartSLOduration=1.88896765 podStartE2EDuration="5.833397988s" podCreationTimestamp="2025-11-26 15:21:36 +0000 UTC" firstStartedPulling="2025-11-26 15:21:37.272812477 +0000 UTC m=+860.951178281" lastFinishedPulling="2025-11-26 15:21:41.217242855 +0000 UTC m=+864.895608619" observedRunningTime="2025-11-26 15:21:41.831688131 +0000 UTC m=+865.510053895" watchObservedRunningTime="2025-11-26 15:21:41.833397988 +0000 UTC m=+865.511763752" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.853154 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8fq5w"] Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.856981 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8fq5w"] Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.857258 4785 scope.go:117] "RemoveContainer" containerID="c00e06cc7a05a2d571507e4b49a027693250ade30f90f09643bc4218fd49dda6" Nov 26 15:21:41 crc kubenswrapper[4785]: I1126 15:21:41.959781 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lcdxx" podUID="e588c635-bfa9-4276-ba15-0ca8b125fc67" containerName="registry-server" probeResult="failure" output=< Nov 26 15:21:41 crc kubenswrapper[4785]: timeout: failed to connect service ":50051" within 
1s Nov 26 15:21:41 crc kubenswrapper[4785]: > Nov 26 15:21:42 crc kubenswrapper[4785]: I1126 15:21:42.813528 4785 generic.go:334] "Generic (PLEG): container finished" podID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerID="7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736" exitCode=0 Nov 26 15:21:42 crc kubenswrapper[4785]: I1126 15:21:42.813666 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerDied","Data":"7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736"} Nov 26 15:21:43 crc kubenswrapper[4785]: I1126 15:21:43.046785 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" path="/var/lib/kubelet/pods/fb85f3c0-cee4-43c6-a07b-058c3e0cd451/volumes" Nov 26 15:21:43 crc kubenswrapper[4785]: I1126 15:21:43.826451 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerStarted","Data":"961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a"} Nov 26 15:21:43 crc kubenswrapper[4785]: I1126 15:21:43.847351 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6gnd4" podStartSLOduration=4.364826975 podStartE2EDuration="9.847336139s" podCreationTimestamp="2025-11-26 15:21:34 +0000 UTC" firstStartedPulling="2025-11-26 15:21:37.762716881 +0000 UTC m=+861.441082645" lastFinishedPulling="2025-11-26 15:21:43.245226035 +0000 UTC m=+866.923591809" observedRunningTime="2025-11-26 15:21:43.842983942 +0000 UTC m=+867.521349726" watchObservedRunningTime="2025-11-26 15:21:43.847336139 +0000 UTC m=+867.525701903" Nov 26 15:21:44 crc kubenswrapper[4785]: I1126 15:21:44.908148 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:44 crc kubenswrapper[4785]: I1126 15:21:44.908343 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.721932 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 26 15:21:45 crc kubenswrapper[4785]: E1126 15:21:45.722448 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="registry-server" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.722463 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="registry-server" Nov 26 15:21:45 crc kubenswrapper[4785]: E1126 15:21:45.722482 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="extract-utilities" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.722489 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="extract-utilities" Nov 26 15:21:45 crc kubenswrapper[4785]: E1126 15:21:45.722505 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="extract-content" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.722512 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="extract-content" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.722655 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85f3c0-cee4-43c6-a07b-058c3e0cd451" containerName="registry-server" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.723313 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.725464 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.725507 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.725855 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.729089 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.729248 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-4mfcx" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.761012 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865208 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ace84da-cbee-4e2b-b473-67dac2985d5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865298 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ace84da-cbee-4e2b-b473-67dac2985d5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865331 4785 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865359 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865407 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865493 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6dj\" (UniqueName: \"kubernetes.io/projected/4ace84da-cbee-4e2b-b473-67dac2985d5e-kube-api-access-rf6dj\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865546 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ace84da-cbee-4e2b-b473-67dac2985d5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.865609 4785 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.950181 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6gnd4" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="registry-server" probeResult="failure" output=< Nov 26 15:21:45 crc kubenswrapper[4785]: timeout: failed to connect service ":50051" within 1s Nov 26 15:21:45 crc kubenswrapper[4785]: > Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967296 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ace84da-cbee-4e2b-b473-67dac2985d5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967356 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967379 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 
15:21:45.967400 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967434 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6dj\" (UniqueName: \"kubernetes.io/projected/4ace84da-cbee-4e2b-b473-67dac2985d5e-kube-api-access-rf6dj\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967455 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ace84da-cbee-4e2b-b473-67dac2985d5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967504 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967544 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ace84da-cbee-4e2b-b473-67dac2985d5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.967981 4785 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.968062 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.968429 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ace84da-cbee-4e2b-b473-67dac2985d5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.972275 4785 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.972331 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d25a9d2af7e3fe3ef35d3dd75b3eea37c26a7e25702e73aa9482f09a0b892c1b/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.973912 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ace84da-cbee-4e2b-b473-67dac2985d5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.974508 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ace84da-cbee-4e2b-b473-67dac2985d5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.979008 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ace84da-cbee-4e2b-b473-67dac2985d5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:45 crc kubenswrapper[4785]: I1126 15:21:45.994938 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6dj\" (UniqueName: \"kubernetes.io/projected/4ace84da-cbee-4e2b-b473-67dac2985d5e-kube-api-access-rf6dj\") pod \"rabbitmq-server-0\" (UID: 
\"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:46 crc kubenswrapper[4785]: I1126 15:21:46.007096 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c7853a7-c9b0-4f56-b58b-e383d658d745\") pod \"rabbitmq-server-0\" (UID: \"4ace84da-cbee-4e2b-b473-67dac2985d5e\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:46 crc kubenswrapper[4785]: I1126 15:21:46.039940 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:21:46 crc kubenswrapper[4785]: I1126 15:21:46.467771 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 26 15:21:46 crc kubenswrapper[4785]: W1126 15:21:46.482258 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ace84da_cbee_4e2b_b473_67dac2985d5e.slice/crio-591363d5a370ad1f4878e3f0d52f9f281e52de6e0353157e0d08b31be2224351 WatchSource:0}: Error finding container 591363d5a370ad1f4878e3f0d52f9f281e52de6e0353157e0d08b31be2224351: Status 404 returned error can't find the container with id 591363d5a370ad1f4878e3f0d52f9f281e52de6e0353157e0d08b31be2224351 Nov 26 15:21:46 crc kubenswrapper[4785]: I1126 15:21:46.847521 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"4ace84da-cbee-4e2b-b473-67dac2985d5e","Type":"ContainerStarted","Data":"591363d5a370ad1f4878e3f0d52f9f281e52de6e0353157e0d08b31be2224351"} Nov 26 15:21:50 crc kubenswrapper[4785]: I1126 15:21:50.946168 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:50 crc kubenswrapper[4785]: I1126 15:21:50.993998 4785 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lcdxx" Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.384437 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-lknwz"] Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.385659 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.389108 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-48tqc" Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.394935 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-lknwz"] Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.556260 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fjwk\" (UniqueName: \"kubernetes.io/projected/e46e72bd-d6c1-48b6-a702-8256e1057ea6-kube-api-access-8fjwk\") pod \"keystone-operator-index-lknwz\" (UID: \"e46e72bd-d6c1-48b6-a702-8256e1057ea6\") " pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.658031 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fjwk\" (UniqueName: \"kubernetes.io/projected/e46e72bd-d6c1-48b6-a702-8256e1057ea6-kube-api-access-8fjwk\") pod \"keystone-operator-index-lknwz\" (UID: \"e46e72bd-d6c1-48b6-a702-8256e1057ea6\") " pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.676186 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fjwk\" (UniqueName: \"kubernetes.io/projected/e46e72bd-d6c1-48b6-a702-8256e1057ea6-kube-api-access-8fjwk\") pod \"keystone-operator-index-lknwz\" (UID: 
\"e46e72bd-d6c1-48b6-a702-8256e1057ea6\") " pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:21:51 crc kubenswrapper[4785]: I1126 15:21:51.707990 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:21:54 crc kubenswrapper[4785]: I1126 15:21:54.948464 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:54 crc kubenswrapper[4785]: I1126 15:21:54.995472 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:21:55 crc kubenswrapper[4785]: I1126 15:21:55.717727 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-lknwz"] Nov 26 15:21:56 crc kubenswrapper[4785]: W1126 15:21:56.006932 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46e72bd_d6c1_48b6_a702_8256e1057ea6.slice/crio-161bcf68b8166e7cbe9f273a805bb307985e174542a37d89d8ae4abd1d49f317 WatchSource:0}: Error finding container 161bcf68b8166e7cbe9f273a805bb307985e174542a37d89d8ae4abd1d49f317: Status 404 returned error can't find the container with id 161bcf68b8166e7cbe9f273a805bb307985e174542a37d89d8ae4abd1d49f317 Nov 26 15:21:56 crc kubenswrapper[4785]: I1126 15:21:56.922306 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-lknwz" event={"ID":"e46e72bd-d6c1-48b6-a702-8256e1057ea6","Type":"ContainerStarted","Data":"161bcf68b8166e7cbe9f273a805bb307985e174542a37d89d8ae4abd1d49f317"} Nov 26 15:21:57 crc kubenswrapper[4785]: I1126 15:21:57.931689 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" 
event={"ID":"4ace84da-cbee-4e2b-b473-67dac2985d5e","Type":"ContainerStarted","Data":"153412cf8c6e56400ca32eacab5edc81a12defd090ba96f1ef61236ac79f6e05"} Nov 26 15:21:57 crc kubenswrapper[4785]: I1126 15:21:57.933134 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-lknwz" event={"ID":"e46e72bd-d6c1-48b6-a702-8256e1057ea6","Type":"ContainerStarted","Data":"12307a345c9579cca14d171155371a6c32acbb65cbad586db455516c7ad21854"} Nov 26 15:21:58 crc kubenswrapper[4785]: I1126 15:21:58.005832 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-lknwz" podStartSLOduration=5.940160194 podStartE2EDuration="7.00580823s" podCreationTimestamp="2025-11-26 15:21:51 +0000 UTC" firstStartedPulling="2025-11-26 15:21:56.009046512 +0000 UTC m=+879.687412266" lastFinishedPulling="2025-11-26 15:21:57.074694538 +0000 UTC m=+880.753060302" observedRunningTime="2025-11-26 15:21:58.001787471 +0000 UTC m=+881.680153235" watchObservedRunningTime="2025-11-26 15:21:58.00580823 +0000 UTC m=+881.684173994" Nov 26 15:21:58 crc kubenswrapper[4785]: I1126 15:21:58.626201 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcdxx"] Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.379338 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmfhd"] Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.379706 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vmfhd" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="registry-server" containerID="cri-o://de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7" gracePeriod=2 Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.796397 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmfhd" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.884004 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-utilities\") pod \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.884078 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkzws\" (UniqueName: \"kubernetes.io/projected/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-kube-api-access-rkzws\") pod \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.884160 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-catalog-content\") pod \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\" (UID: \"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc\") " Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.885113 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-utilities" (OuterVolumeSpecName: "utilities") pod "5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" (UID: "5a04c61d-ff4c-486b-bba4-1b133ac1b4bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.890020 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-kube-api-access-rkzws" (OuterVolumeSpecName: "kube-api-access-rkzws") pod "5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" (UID: "5a04c61d-ff4c-486b-bba4-1b133ac1b4bc"). InnerVolumeSpecName "kube-api-access-rkzws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.943825 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" (UID: "5a04c61d-ff4c-486b-bba4-1b133ac1b4bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.949960 4785 generic.go:334] "Generic (PLEG): container finished" podID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerID="de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7" exitCode=0 Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.950004 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerDied","Data":"de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7"} Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.950038 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vmfhd" event={"ID":"5a04c61d-ff4c-486b-bba4-1b133ac1b4bc","Type":"ContainerDied","Data":"8d8c953e869d1359ac8a59f9ba266751d61a4ce435f334cfbffc0fd59c03e0f5"} Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.950056 4785 scope.go:117] "RemoveContainer" containerID="de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.950047 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vmfhd" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.967780 4785 scope.go:117] "RemoveContainer" containerID="9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.977482 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vmfhd"] Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.985438 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.985466 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkzws\" (UniqueName: \"kubernetes.io/projected/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-kube-api-access-rkzws\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.985477 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.987526 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vmfhd"] Nov 26 15:21:59 crc kubenswrapper[4785]: I1126 15:21:59.996629 4785 scope.go:117] "RemoveContainer" containerID="5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27" Nov 26 15:22:00 crc kubenswrapper[4785]: I1126 15:22:00.013572 4785 scope.go:117] "RemoveContainer" containerID="de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7" Nov 26 15:22:00 crc kubenswrapper[4785]: E1126 15:22:00.013932 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7\": container with ID starting with de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7 not found: ID does not exist" containerID="de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7" Nov 26 15:22:00 crc kubenswrapper[4785]: I1126 15:22:00.013961 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7"} err="failed to get container status \"de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7\": rpc error: code = NotFound desc = could not find container \"de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7\": container with ID starting with de1687996badef926084a3430b2d04e3f233a9c701c5df62ac2e93cb978005b7 not found: ID does not exist" Nov 26 15:22:00 crc kubenswrapper[4785]: I1126 15:22:00.013981 4785 scope.go:117] "RemoveContainer" containerID="9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804" Nov 26 15:22:00 crc kubenswrapper[4785]: E1126 15:22:00.014833 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804\": container with ID starting with 9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804 not found: ID does not exist" containerID="9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804" Nov 26 15:22:00 crc kubenswrapper[4785]: I1126 15:22:00.014854 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804"} err="failed to get container status \"9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804\": rpc error: code = NotFound desc = could not find container \"9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804\": container with ID 
starting with 9d9a1af617b07fed98d5817b0e4f695037d1d6b5e400fe7be6acafbdeb16b804 not found: ID does not exist" Nov 26 15:22:00 crc kubenswrapper[4785]: I1126 15:22:00.014867 4785 scope.go:117] "RemoveContainer" containerID="5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27" Nov 26 15:22:00 crc kubenswrapper[4785]: E1126 15:22:00.015240 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27\": container with ID starting with 5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27 not found: ID does not exist" containerID="5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27" Nov 26 15:22:00 crc kubenswrapper[4785]: I1126 15:22:00.015292 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27"} err="failed to get container status \"5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27\": rpc error: code = NotFound desc = could not find container \"5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27\": container with ID starting with 5ca68575ce37f5e734e37f3b770e8d88c75e3fae1f2109218f63df7630d7ae27 not found: ID does not exist" Nov 26 15:22:01 crc kubenswrapper[4785]: I1126 15:22:01.047635 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" path="/var/lib/kubelet/pods/5a04c61d-ff4c-486b-bba4-1b133ac1b4bc/volumes" Nov 26 15:22:01 crc kubenswrapper[4785]: I1126 15:22:01.708591 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:22:01 crc kubenswrapper[4785]: I1126 15:22:01.709023 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:22:01 crc 
kubenswrapper[4785]: I1126 15:22:01.756087 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.383281 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gnd4"] Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.384445 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6gnd4" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="registry-server" containerID="cri-o://961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a" gracePeriod=2 Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.777181 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.903976 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v88xm\" (UniqueName: \"kubernetes.io/projected/e9f21285-8a49-46c8-996d-6f51474bbc1c-kube-api-access-v88xm\") pod \"e9f21285-8a49-46c8-996d-6f51474bbc1c\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.904056 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-catalog-content\") pod \"e9f21285-8a49-46c8-996d-6f51474bbc1c\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.904135 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-utilities\") pod \"e9f21285-8a49-46c8-996d-6f51474bbc1c\" (UID: \"e9f21285-8a49-46c8-996d-6f51474bbc1c\") " Nov 26 15:22:07 crc 
kubenswrapper[4785]: I1126 15:22:07.905356 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-utilities" (OuterVolumeSpecName: "utilities") pod "e9f21285-8a49-46c8-996d-6f51474bbc1c" (UID: "e9f21285-8a49-46c8-996d-6f51474bbc1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.909901 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f21285-8a49-46c8-996d-6f51474bbc1c-kube-api-access-v88xm" (OuterVolumeSpecName: "kube-api-access-v88xm") pod "e9f21285-8a49-46c8-996d-6f51474bbc1c" (UID: "e9f21285-8a49-46c8-996d-6f51474bbc1c"). InnerVolumeSpecName "kube-api-access-v88xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:22:07 crc kubenswrapper[4785]: I1126 15:22:07.985401 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9f21285-8a49-46c8-996d-6f51474bbc1c" (UID: "e9f21285-8a49-46c8-996d-6f51474bbc1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.005918 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v88xm\" (UniqueName: \"kubernetes.io/projected/e9f21285-8a49-46c8-996d-6f51474bbc1c-kube-api-access-v88xm\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.006131 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.006191 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f21285-8a49-46c8-996d-6f51474bbc1c-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.015442 4785 generic.go:334] "Generic (PLEG): container finished" podID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerID="961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a" exitCode=0 Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.015493 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerDied","Data":"961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a"} Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.015508 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gnd4" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.015530 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gnd4" event={"ID":"e9f21285-8a49-46c8-996d-6f51474bbc1c","Type":"ContainerDied","Data":"610ffa28bc3e89ba39b5ecefc63e0f846cb828c0aad9c69b24188b4dd8c1487e"} Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.015567 4785 scope.go:117] "RemoveContainer" containerID="961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.044061 4785 scope.go:117] "RemoveContainer" containerID="7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.052667 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gnd4"] Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.056742 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6gnd4"] Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.062857 4785 scope.go:117] "RemoveContainer" containerID="15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.083640 4785 scope.go:117] "RemoveContainer" containerID="961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a" Nov 26 15:22:08 crc kubenswrapper[4785]: E1126 15:22:08.084040 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a\": container with ID starting with 961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a not found: ID does not exist" containerID="961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.084136 4785 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a"} err="failed to get container status \"961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a\": rpc error: code = NotFound desc = could not find container \"961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a\": container with ID starting with 961b45c658edcafe3090667b932756b3c36daa61350c5b1739f295f830165f1a not found: ID does not exist" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.084217 4785 scope.go:117] "RemoveContainer" containerID="7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736" Nov 26 15:22:08 crc kubenswrapper[4785]: E1126 15:22:08.084539 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736\": container with ID starting with 7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736 not found: ID does not exist" containerID="7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.084603 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736"} err="failed to get container status \"7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736\": rpc error: code = NotFound desc = could not find container \"7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736\": container with ID starting with 7f75a5827f03483f8e61b3a4a213a8e29a16119f70a42941bbcb297be555d736 not found: ID does not exist" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.084629 4785 scope.go:117] "RemoveContainer" containerID="15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed" Nov 26 15:22:08 crc kubenswrapper[4785]: E1126 
15:22:08.084877 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed\": container with ID starting with 15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed not found: ID does not exist" containerID="15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed" Nov 26 15:22:08 crc kubenswrapper[4785]: I1126 15:22:08.084918 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed"} err="failed to get container status \"15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed\": rpc error: code = NotFound desc = could not find container \"15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed\": container with ID starting with 15b23977197dfd5e3a700230df43e60cb1f996522cfc8e8a485df41cba6907ed not found: ID does not exist" Nov 26 15:22:09 crc kubenswrapper[4785]: I1126 15:22:09.050544 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" path="/var/lib/kubelet/pods/e9f21285-8a49-46c8-996d-6f51474bbc1c/volumes" Nov 26 15:22:11 crc kubenswrapper[4785]: I1126 15:22:11.751927 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-lknwz" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.429388 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6"] Nov 26 15:22:14 crc kubenswrapper[4785]: E1126 15:22:14.429941 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="extract-content" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.429955 4785 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="extract-content" Nov 26 15:22:14 crc kubenswrapper[4785]: E1126 15:22:14.429967 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="registry-server" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.429974 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="registry-server" Nov 26 15:22:14 crc kubenswrapper[4785]: E1126 15:22:14.429993 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="registry-server" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.430002 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="registry-server" Nov 26 15:22:14 crc kubenswrapper[4785]: E1126 15:22:14.430016 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="extract-content" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.430024 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="extract-content" Nov 26 15:22:14 crc kubenswrapper[4785]: E1126 15:22:14.430042 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="extract-utilities" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.430051 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="extract-utilities" Nov 26 15:22:14 crc kubenswrapper[4785]: E1126 15:22:14.430064 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="extract-utilities" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.430072 4785 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="extract-utilities" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.430199 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f21285-8a49-46c8-996d-6f51474bbc1c" containerName="registry-server" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.430218 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a04c61d-ff4c-486b-bba4-1b133ac1b4bc" containerName="registry-server" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.431195 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.433302 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sfhll" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.447075 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6"] Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.626907 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.627023 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " 
pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.627204 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/bdc1a44d-c408-48d0-8df4-d52394f541eb-kube-api-access-8pvbh\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.728620 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.728791 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.728922 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/bdc1a44d-c408-48d0-8df4-d52394f541eb-kube-api-access-8pvbh\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc 
kubenswrapper[4785]: I1126 15:22:14.729452 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.729809 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:14 crc kubenswrapper[4785]: I1126 15:22:14.768648 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/bdc1a44d-c408-48d0-8df4-d52394f541eb-kube-api-access-8pvbh\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:15 crc kubenswrapper[4785]: I1126 15:22:15.049703 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:15 crc kubenswrapper[4785]: I1126 15:22:15.557042 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6"] Nov 26 15:22:15 crc kubenswrapper[4785]: W1126 15:22:15.566727 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc1a44d_c408_48d0_8df4_d52394f541eb.slice/crio-b0bace99acb5668defacc59d7042054306be9b17d01a863601184b837c8f37e2 WatchSource:0}: Error finding container b0bace99acb5668defacc59d7042054306be9b17d01a863601184b837c8f37e2: Status 404 returned error can't find the container with id b0bace99acb5668defacc59d7042054306be9b17d01a863601184b837c8f37e2 Nov 26 15:22:16 crc kubenswrapper[4785]: I1126 15:22:16.085613 4785 generic.go:334] "Generic (PLEG): container finished" podID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerID="dd90cd47fab997baaacb5c0b8f3f4a041143e895eccb622a819457371bcf1d5e" exitCode=0 Nov 26 15:22:16 crc kubenswrapper[4785]: I1126 15:22:16.085833 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" event={"ID":"bdc1a44d-c408-48d0-8df4-d52394f541eb","Type":"ContainerDied","Data":"dd90cd47fab997baaacb5c0b8f3f4a041143e895eccb622a819457371bcf1d5e"} Nov 26 15:22:16 crc kubenswrapper[4785]: I1126 15:22:16.085967 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" event={"ID":"bdc1a44d-c408-48d0-8df4-d52394f541eb","Type":"ContainerStarted","Data":"b0bace99acb5668defacc59d7042054306be9b17d01a863601184b837c8f37e2"} Nov 26 15:22:17 crc kubenswrapper[4785]: I1126 15:22:17.094544 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" event={"ID":"bdc1a44d-c408-48d0-8df4-d52394f541eb","Type":"ContainerStarted","Data":"325854c18696de401251e62f512434385f19bc23c27263fc5db4e6c32b22369f"} Nov 26 15:22:18 crc kubenswrapper[4785]: I1126 15:22:18.101430 4785 generic.go:334] "Generic (PLEG): container finished" podID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerID="325854c18696de401251e62f512434385f19bc23c27263fc5db4e6c32b22369f" exitCode=0 Nov 26 15:22:18 crc kubenswrapper[4785]: I1126 15:22:18.101493 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" event={"ID":"bdc1a44d-c408-48d0-8df4-d52394f541eb","Type":"ContainerDied","Data":"325854c18696de401251e62f512434385f19bc23c27263fc5db4e6c32b22369f"} Nov 26 15:22:19 crc kubenswrapper[4785]: I1126 15:22:19.110406 4785 generic.go:334] "Generic (PLEG): container finished" podID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerID="917fd63b27990d586af02ee4afa5ba439a30a074edfdc19ddaefba2b70efa884" exitCode=0 Nov 26 15:22:19 crc kubenswrapper[4785]: I1126 15:22:19.110459 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" event={"ID":"bdc1a44d-c408-48d0-8df4-d52394f541eb","Type":"ContainerDied","Data":"917fd63b27990d586af02ee4afa5ba439a30a074edfdc19ddaefba2b70efa884"} Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.339917 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.420528 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/bdc1a44d-c408-48d0-8df4-d52394f541eb-kube-api-access-8pvbh\") pod \"bdc1a44d-c408-48d0-8df4-d52394f541eb\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.420618 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-bundle\") pod \"bdc1a44d-c408-48d0-8df4-d52394f541eb\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.420659 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-util\") pod \"bdc1a44d-c408-48d0-8df4-d52394f541eb\" (UID: \"bdc1a44d-c408-48d0-8df4-d52394f541eb\") " Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.421436 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-bundle" (OuterVolumeSpecName: "bundle") pod "bdc1a44d-c408-48d0-8df4-d52394f541eb" (UID: "bdc1a44d-c408-48d0-8df4-d52394f541eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.425577 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc1a44d-c408-48d0-8df4-d52394f541eb-kube-api-access-8pvbh" (OuterVolumeSpecName: "kube-api-access-8pvbh") pod "bdc1a44d-c408-48d0-8df4-d52394f541eb" (UID: "bdc1a44d-c408-48d0-8df4-d52394f541eb"). InnerVolumeSpecName "kube-api-access-8pvbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.435119 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-util" (OuterVolumeSpecName: "util") pod "bdc1a44d-c408-48d0-8df4-d52394f541eb" (UID: "bdc1a44d-c408-48d0-8df4-d52394f541eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.521883 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/bdc1a44d-c408-48d0-8df4-d52394f541eb-kube-api-access-8pvbh\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.521924 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:20 crc kubenswrapper[4785]: I1126 15:22:20.521933 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bdc1a44d-c408-48d0-8df4-d52394f541eb-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:21 crc kubenswrapper[4785]: I1126 15:22:21.123540 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" event={"ID":"bdc1a44d-c408-48d0-8df4-d52394f541eb","Type":"ContainerDied","Data":"b0bace99acb5668defacc59d7042054306be9b17d01a863601184b837c8f37e2"} Nov 26 15:22:21 crc kubenswrapper[4785]: I1126 15:22:21.123614 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0bace99acb5668defacc59d7042054306be9b17d01a863601184b837c8f37e2" Nov 26 15:22:21 crc kubenswrapper[4785]: I1126 15:22:21.123625 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.192813 4785 generic.go:334] "Generic (PLEG): container finished" podID="4ace84da-cbee-4e2b-b473-67dac2985d5e" containerID="153412cf8c6e56400ca32eacab5edc81a12defd090ba96f1ef61236ac79f6e05" exitCode=0 Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.192940 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"4ace84da-cbee-4e2b-b473-67dac2985d5e","Type":"ContainerDied","Data":"153412cf8c6e56400ca32eacab5edc81a12defd090ba96f1ef61236ac79f6e05"} Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.585685 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd"] Nov 26 15:22:30 crc kubenswrapper[4785]: E1126 15:22:30.586599 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerName="pull" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.586678 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerName="pull" Nov 26 15:22:30 crc kubenswrapper[4785]: E1126 15:22:30.586767 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerName="util" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.586826 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerName="util" Nov 26 15:22:30 crc kubenswrapper[4785]: E1126 15:22:30.586882 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerName="extract" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.586940 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" 
containerName="extract" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.587110 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc1a44d-c408-48d0-8df4-d52394f541eb" containerName="extract" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.587597 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.589794 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.590153 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4nwqx" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.605447 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd"] Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.668910 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-webhook-cert\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.668978 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4zs\" (UniqueName: \"kubernetes.io/projected/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-kube-api-access-2s4zs\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 
15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.669126 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-apiservice-cert\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.770600 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4zs\" (UniqueName: \"kubernetes.io/projected/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-kube-api-access-2s4zs\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.770684 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-apiservice-cert\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.770740 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-webhook-cert\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.775996 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-apiservice-cert\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.783201 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-webhook-cert\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.790492 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4zs\" (UniqueName: \"kubernetes.io/projected/bc8e3329-ae9c-48b1-a49c-92eeef6ae114-kube-api-access-2s4zs\") pod \"keystone-operator-controller-manager-68b4f95d6c-cpkqd\" (UID: \"bc8e3329-ae9c-48b1-a49c-92eeef6ae114\") " pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:30 crc kubenswrapper[4785]: I1126 15:22:30.901847 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:31 crc kubenswrapper[4785]: W1126 15:22:31.137684 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc8e3329_ae9c_48b1_a49c_92eeef6ae114.slice/crio-90e77b943f9e749cf7a050045d53769323c55ee9349ec68d7eb8fe2e460e5160 WatchSource:0}: Error finding container 90e77b943f9e749cf7a050045d53769323c55ee9349ec68d7eb8fe2e460e5160: Status 404 returned error can't find the container with id 90e77b943f9e749cf7a050045d53769323c55ee9349ec68d7eb8fe2e460e5160 Nov 26 15:22:31 crc kubenswrapper[4785]: I1126 15:22:31.141143 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd"] Nov 26 15:22:31 crc kubenswrapper[4785]: I1126 15:22:31.217814 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"4ace84da-cbee-4e2b-b473-67dac2985d5e","Type":"ContainerStarted","Data":"d12ee5de90ea7d95d33ec7bcd40611b20d469d9c7f7210d9762480be6384ce1a"} Nov 26 15:22:31 crc kubenswrapper[4785]: I1126 15:22:31.218173 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:22:31 crc kubenswrapper[4785]: I1126 15:22:31.219706 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" event={"ID":"bc8e3329-ae9c-48b1-a49c-92eeef6ae114","Type":"ContainerStarted","Data":"90e77b943f9e749cf7a050045d53769323c55ee9349ec68d7eb8fe2e460e5160"} Nov 26 15:22:31 crc kubenswrapper[4785]: I1126 15:22:31.246703 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.647601497 podStartE2EDuration="47.246682559s" podCreationTimestamp="2025-11-26 15:21:44 +0000 UTC" 
firstStartedPulling="2025-11-26 15:21:46.485534892 +0000 UTC m=+870.163900666" lastFinishedPulling="2025-11-26 15:21:56.084615964 +0000 UTC m=+879.762981728" observedRunningTime="2025-11-26 15:22:31.243615389 +0000 UTC m=+914.921981163" watchObservedRunningTime="2025-11-26 15:22:31.246682559 +0000 UTC m=+914.925048323" Nov 26 15:22:35 crc kubenswrapper[4785]: I1126 15:22:35.251292 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" event={"ID":"bc8e3329-ae9c-48b1-a49c-92eeef6ae114","Type":"ContainerStarted","Data":"6f9f9b6a1f351c004314e3206140ceb097bd494a3c7a042b212fcd42e9327814"} Nov 26 15:22:35 crc kubenswrapper[4785]: I1126 15:22:35.251941 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:35 crc kubenswrapper[4785]: I1126 15:22:35.268280 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" podStartSLOduration=1.741055899 podStartE2EDuration="5.268258296s" podCreationTimestamp="2025-11-26 15:22:30 +0000 UTC" firstStartedPulling="2025-11-26 15:22:31.140215685 +0000 UTC m=+914.818581449" lastFinishedPulling="2025-11-26 15:22:34.667418082 +0000 UTC m=+918.345783846" observedRunningTime="2025-11-26 15:22:35.265531355 +0000 UTC m=+918.943897159" watchObservedRunningTime="2025-11-26 15:22:35.268258296 +0000 UTC m=+918.946624090" Nov 26 15:22:37 crc kubenswrapper[4785]: I1126 15:22:37.289196 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:22:37 crc kubenswrapper[4785]: I1126 15:22:37.290029 4785 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:22:40 crc kubenswrapper[4785]: I1126 15:22:40.908505 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.466294 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2"] Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.467600 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.469651 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.479419 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2"] Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.556703 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-8m6gk"] Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.557861 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.563482 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-8m6gk"] Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.602262 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b3e785-2714-48e3-a605-95a8056b8220-operator-scripts\") pod \"keystone-84ae-account-create-update-c7dg2\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.602355 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc57\" (UniqueName: \"kubernetes.io/projected/74b3e785-2714-48e3-a605-95a8056b8220-kube-api-access-mpc57\") pod \"keystone-84ae-account-create-update-c7dg2\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.703934 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b3e785-2714-48e3-a605-95a8056b8220-operator-scripts\") pod \"keystone-84ae-account-create-update-c7dg2\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.704005 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbh4\" (UniqueName: \"kubernetes.io/projected/720626e8-8cf9-499b-970c-95fce7ef463a-kube-api-access-bzbh4\") pod \"keystone-db-create-8m6gk\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " pod="glance-kuttl-tests/keystone-db-create-8m6gk" 
Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.704038 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720626e8-8cf9-499b-970c-95fce7ef463a-operator-scripts\") pod \"keystone-db-create-8m6gk\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.704092 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc57\" (UniqueName: \"kubernetes.io/projected/74b3e785-2714-48e3-a605-95a8056b8220-kube-api-access-mpc57\") pod \"keystone-84ae-account-create-update-c7dg2\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.704718 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b3e785-2714-48e3-a605-95a8056b8220-operator-scripts\") pod \"keystone-84ae-account-create-update-c7dg2\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.743962 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc57\" (UniqueName: \"kubernetes.io/projected/74b3e785-2714-48e3-a605-95a8056b8220-kube-api-access-mpc57\") pod \"keystone-84ae-account-create-update-c7dg2\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.805029 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbh4\" (UniqueName: \"kubernetes.io/projected/720626e8-8cf9-499b-970c-95fce7ef463a-kube-api-access-bzbh4\") pod 
\"keystone-db-create-8m6gk\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.805085 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720626e8-8cf9-499b-970c-95fce7ef463a-operator-scripts\") pod \"keystone-db-create-8m6gk\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.805772 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720626e8-8cf9-499b-970c-95fce7ef463a-operator-scripts\") pod \"keystone-db-create-8m6gk\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.826832 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbh4\" (UniqueName: \"kubernetes.io/projected/720626e8-8cf9-499b-970c-95fce7ef463a-kube-api-access-bzbh4\") pod \"keystone-db-create-8m6gk\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.827391 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:45 crc kubenswrapper[4785]: I1126 15:22:45.881329 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:46 crc kubenswrapper[4785]: I1126 15:22:46.084718 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 26 15:22:46 crc kubenswrapper[4785]: I1126 15:22:46.242973 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2"] Nov 26 15:22:46 crc kubenswrapper[4785]: W1126 15:22:46.250685 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74b3e785_2714_48e3_a605_95a8056b8220.slice/crio-c6f0d8f142c6a1021ee2e27d6970e2abe9efc47a29f6848ab6a8120810963d4c WatchSource:0}: Error finding container c6f0d8f142c6a1021ee2e27d6970e2abe9efc47a29f6848ab6a8120810963d4c: Status 404 returned error can't find the container with id c6f0d8f142c6a1021ee2e27d6970e2abe9efc47a29f6848ab6a8120810963d4c Nov 26 15:22:46 crc kubenswrapper[4785]: I1126 15:22:46.327598 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" event={"ID":"74b3e785-2714-48e3-a605-95a8056b8220","Type":"ContainerStarted","Data":"c6f0d8f142c6a1021ee2e27d6970e2abe9efc47a29f6848ab6a8120810963d4c"} Nov 26 15:22:46 crc kubenswrapper[4785]: I1126 15:22:46.378516 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-8m6gk"] Nov 26 15:22:46 crc kubenswrapper[4785]: W1126 15:22:46.383540 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720626e8_8cf9_499b_970c_95fce7ef463a.slice/crio-9ed0068057cc84c22b01a9197bb026008734c94ed8fe3c9d045a14ac9099451d WatchSource:0}: Error finding container 9ed0068057cc84c22b01a9197bb026008734c94ed8fe3c9d045a14ac9099451d: Status 404 returned error can't find the container with id 
9ed0068057cc84c22b01a9197bb026008734c94ed8fe3c9d045a14ac9099451d Nov 26 15:22:47 crc kubenswrapper[4785]: I1126 15:22:47.336312 4785 generic.go:334] "Generic (PLEG): container finished" podID="74b3e785-2714-48e3-a605-95a8056b8220" containerID="0867da53aa2c9940d8e7df4d24d84370f7a4815a48484465e7f9b7abebd3337d" exitCode=0 Nov 26 15:22:47 crc kubenswrapper[4785]: I1126 15:22:47.336369 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" event={"ID":"74b3e785-2714-48e3-a605-95a8056b8220","Type":"ContainerDied","Data":"0867da53aa2c9940d8e7df4d24d84370f7a4815a48484465e7f9b7abebd3337d"} Nov 26 15:22:47 crc kubenswrapper[4785]: I1126 15:22:47.339351 4785 generic.go:334] "Generic (PLEG): container finished" podID="720626e8-8cf9-499b-970c-95fce7ef463a" containerID="2ffccd3fa5f3f4730162cea4ccfc21ba0252eb8a41b340b551ef2bffa59e36ff" exitCode=0 Nov 26 15:22:47 crc kubenswrapper[4785]: I1126 15:22:47.339433 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-8m6gk" event={"ID":"720626e8-8cf9-499b-970c-95fce7ef463a","Type":"ContainerDied","Data":"2ffccd3fa5f3f4730162cea4ccfc21ba0252eb8a41b340b551ef2bffa59e36ff"} Nov 26 15:22:47 crc kubenswrapper[4785]: I1126 15:22:47.339466 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-8m6gk" event={"ID":"720626e8-8cf9-499b-970c-95fce7ef463a","Type":"ContainerStarted","Data":"9ed0068057cc84c22b01a9197bb026008734c94ed8fe3c9d045a14ac9099451d"} Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.176323 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-lxj68"] Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.177250 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.179837 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-k26qg" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.190719 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-lxj68"] Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.344821 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl6fn\" (UniqueName: \"kubernetes.io/projected/d0c4491d-f126-40e0-b816-edda3257c939-kube-api-access-zl6fn\") pod \"horizon-operator-index-lxj68\" (UID: \"d0c4491d-f126-40e0-b816-edda3257c939\") " pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.446831 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl6fn\" (UniqueName: \"kubernetes.io/projected/d0c4491d-f126-40e0-b816-edda3257c939-kube-api-access-zl6fn\") pod \"horizon-operator-index-lxj68\" (UID: \"d0c4491d-f126-40e0-b816-edda3257c939\") " pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.474780 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl6fn\" (UniqueName: \"kubernetes.io/projected/d0c4491d-f126-40e0-b816-edda3257c939-kube-api-access-zl6fn\") pod \"horizon-operator-index-lxj68\" (UID: \"d0c4491d-f126-40e0-b816-edda3257c939\") " pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.499791 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.681494 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.689110 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.853723 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpc57\" (UniqueName: \"kubernetes.io/projected/74b3e785-2714-48e3-a605-95a8056b8220-kube-api-access-mpc57\") pod \"74b3e785-2714-48e3-a605-95a8056b8220\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.853790 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b3e785-2714-48e3-a605-95a8056b8220-operator-scripts\") pod \"74b3e785-2714-48e3-a605-95a8056b8220\" (UID: \"74b3e785-2714-48e3-a605-95a8056b8220\") " Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.853842 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720626e8-8cf9-499b-970c-95fce7ef463a-operator-scripts\") pod \"720626e8-8cf9-499b-970c-95fce7ef463a\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.853918 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbh4\" (UniqueName: \"kubernetes.io/projected/720626e8-8cf9-499b-970c-95fce7ef463a-kube-api-access-bzbh4\") pod \"720626e8-8cf9-499b-970c-95fce7ef463a\" (UID: \"720626e8-8cf9-499b-970c-95fce7ef463a\") " Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 
15:22:48.854538 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b3e785-2714-48e3-a605-95a8056b8220-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74b3e785-2714-48e3-a605-95a8056b8220" (UID: "74b3e785-2714-48e3-a605-95a8056b8220"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.854538 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720626e8-8cf9-499b-970c-95fce7ef463a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "720626e8-8cf9-499b-970c-95fce7ef463a" (UID: "720626e8-8cf9-499b-970c-95fce7ef463a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.857082 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b3e785-2714-48e3-a605-95a8056b8220-kube-api-access-mpc57" (OuterVolumeSpecName: "kube-api-access-mpc57") pod "74b3e785-2714-48e3-a605-95a8056b8220" (UID: "74b3e785-2714-48e3-a605-95a8056b8220"). InnerVolumeSpecName "kube-api-access-mpc57". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.857680 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720626e8-8cf9-499b-970c-95fce7ef463a-kube-api-access-bzbh4" (OuterVolumeSpecName: "kube-api-access-bzbh4") pod "720626e8-8cf9-499b-970c-95fce7ef463a" (UID: "720626e8-8cf9-499b-970c-95fce7ef463a"). InnerVolumeSpecName "kube-api-access-bzbh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.930108 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-lxj68"] Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.954890 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbh4\" (UniqueName: \"kubernetes.io/projected/720626e8-8cf9-499b-970c-95fce7ef463a-kube-api-access-bzbh4\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.954924 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpc57\" (UniqueName: \"kubernetes.io/projected/74b3e785-2714-48e3-a605-95a8056b8220-kube-api-access-mpc57\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.954935 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b3e785-2714-48e3-a605-95a8056b8220-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:48 crc kubenswrapper[4785]: I1126 15:22:48.954943 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/720626e8-8cf9-499b-970c-95fce7ef463a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:49 crc kubenswrapper[4785]: I1126 15:22:49.352133 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-lxj68" event={"ID":"d0c4491d-f126-40e0-b816-edda3257c939","Type":"ContainerStarted","Data":"8e4aabc8a60869496366a9e6a8eff2cfa0f8ea6920fa348546d1ef662c71af5b"} Nov 26 15:22:49 crc kubenswrapper[4785]: I1126 15:22:49.353572 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-8m6gk" event={"ID":"720626e8-8cf9-499b-970c-95fce7ef463a","Type":"ContainerDied","Data":"9ed0068057cc84c22b01a9197bb026008734c94ed8fe3c9d045a14ac9099451d"} Nov 26 15:22:49 crc 
kubenswrapper[4785]: I1126 15:22:49.353621 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed0068057cc84c22b01a9197bb026008734c94ed8fe3c9d045a14ac9099451d" Nov 26 15:22:49 crc kubenswrapper[4785]: I1126 15:22:49.353583 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-8m6gk" Nov 26 15:22:49 crc kubenswrapper[4785]: I1126 15:22:49.354657 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" event={"ID":"74b3e785-2714-48e3-a605-95a8056b8220","Type":"ContainerDied","Data":"c6f0d8f142c6a1021ee2e27d6970e2abe9efc47a29f6848ab6a8120810963d4c"} Nov 26 15:22:49 crc kubenswrapper[4785]: I1126 15:22:49.354681 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f0d8f142c6a1021ee2e27d6970e2abe9efc47a29f6848ab6a8120810963d4c" Nov 26 15:22:49 crc kubenswrapper[4785]: I1126 15:22:49.354689 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2" Nov 26 15:22:50 crc kubenswrapper[4785]: I1126 15:22:50.363517 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-lxj68" event={"ID":"d0c4491d-f126-40e0-b816-edda3257c939","Type":"ContainerStarted","Data":"30a99bd1d04da190c9ca9bbce3f155e1e1e59b8f6ee5df199b9eb29d72cc0b96"} Nov 26 15:22:50 crc kubenswrapper[4785]: I1126 15:22:50.392574 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-lxj68" podStartSLOduration=1.477959563 podStartE2EDuration="2.39253617s" podCreationTimestamp="2025-11-26 15:22:48 +0000 UTC" firstStartedPulling="2025-11-26 15:22:48.943532689 +0000 UTC m=+932.621898453" lastFinishedPulling="2025-11-26 15:22:49.858109256 +0000 UTC m=+933.536475060" observedRunningTime="2025-11-26 15:22:50.38794396 +0000 UTC m=+934.066309744" watchObservedRunningTime="2025-11-26 15:22:50.39253617 +0000 UTC m=+934.070901954" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.131829 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-26zth"] Nov 26 15:22:51 crc kubenswrapper[4785]: E1126 15:22:51.132052 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b3e785-2714-48e3-a605-95a8056b8220" containerName="mariadb-account-create-update" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.132063 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b3e785-2714-48e3-a605-95a8056b8220" containerName="mariadb-account-create-update" Nov 26 15:22:51 crc kubenswrapper[4785]: E1126 15:22:51.132070 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720626e8-8cf9-499b-970c-95fce7ef463a" containerName="mariadb-database-create" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.132076 4785 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="720626e8-8cf9-499b-970c-95fce7ef463a" containerName="mariadb-database-create" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.132187 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="720626e8-8cf9-499b-970c-95fce7ef463a" containerName="mariadb-database-create" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.132196 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b3e785-2714-48e3-a605-95a8056b8220" containerName="mariadb-account-create-update" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.132593 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.134695 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gwfvc" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.134861 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.134909 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.135133 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.139952 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-26zth"] Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.288277 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6236e0d-402b-4338-93c8-4f1a65013acc-config-data\") pod \"keystone-db-sync-26zth\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: 
I1126 15:22:51.288384 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49cx\" (UniqueName: \"kubernetes.io/projected/d6236e0d-402b-4338-93c8-4f1a65013acc-kube-api-access-z49cx\") pod \"keystone-db-sync-26zth\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.378984 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-n27b9"] Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.380033 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.382341 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-rzv55" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.390222 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6236e0d-402b-4338-93c8-4f1a65013acc-config-data\") pod \"keystone-db-sync-26zth\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.392399 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z49cx\" (UniqueName: \"kubernetes.io/projected/d6236e0d-402b-4338-93c8-4f1a65013acc-kube-api-access-z49cx\") pod \"keystone-db-sync-26zth\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.395338 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-n27b9"] Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.401252 4785 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6236e0d-402b-4338-93c8-4f1a65013acc-config-data\") pod \"keystone-db-sync-26zth\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.419222 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49cx\" (UniqueName: \"kubernetes.io/projected/d6236e0d-402b-4338-93c8-4f1a65013acc-kube-api-access-z49cx\") pod \"keystone-db-sync-26zth\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.450220 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.493975 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsb5r\" (UniqueName: \"kubernetes.io/projected/a621a417-b542-4b88-a0b0-12b77aee924b-kube-api-access-fsb5r\") pod \"swift-operator-index-n27b9\" (UID: \"a621a417-b542-4b88-a0b0-12b77aee924b\") " pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.595055 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsb5r\" (UniqueName: \"kubernetes.io/projected/a621a417-b542-4b88-a0b0-12b77aee924b-kube-api-access-fsb5r\") pod \"swift-operator-index-n27b9\" (UID: \"a621a417-b542-4b88-a0b0-12b77aee924b\") " pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.612384 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsb5r\" (UniqueName: \"kubernetes.io/projected/a621a417-b542-4b88-a0b0-12b77aee924b-kube-api-access-fsb5r\") pod \"swift-operator-index-n27b9\" (UID: 
\"a621a417-b542-4b88-a0b0-12b77aee924b\") " pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.756662 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:22:51 crc kubenswrapper[4785]: I1126 15:22:51.870613 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-26zth"] Nov 26 15:22:51 crc kubenswrapper[4785]: W1126 15:22:51.884507 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6236e0d_402b_4338_93c8_4f1a65013acc.slice/crio-359bd55762df065a9ac318a311bd8010b7a865450ef9fbf33a8c1cefc4c29ce1 WatchSource:0}: Error finding container 359bd55762df065a9ac318a311bd8010b7a865450ef9fbf33a8c1cefc4c29ce1: Status 404 returned error can't find the container with id 359bd55762df065a9ac318a311bd8010b7a865450ef9fbf33a8c1cefc4c29ce1 Nov 26 15:22:52 crc kubenswrapper[4785]: I1126 15:22:52.000450 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-n27b9"] Nov 26 15:22:52 crc kubenswrapper[4785]: W1126 15:22:52.009162 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda621a417_b542_4b88_a0b0_12b77aee924b.slice/crio-f5ffcef25f79c195cfdcc5149b74465aab7b8409c54c38b4845250e19539e65b WatchSource:0}: Error finding container f5ffcef25f79c195cfdcc5149b74465aab7b8409c54c38b4845250e19539e65b: Status 404 returned error can't find the container with id f5ffcef25f79c195cfdcc5149b74465aab7b8409c54c38b4845250e19539e65b Nov 26 15:22:52 crc kubenswrapper[4785]: I1126 15:22:52.380704 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-26zth" 
event={"ID":"d6236e0d-402b-4338-93c8-4f1a65013acc","Type":"ContainerStarted","Data":"359bd55762df065a9ac318a311bd8010b7a865450ef9fbf33a8c1cefc4c29ce1"} Nov 26 15:22:52 crc kubenswrapper[4785]: I1126 15:22:52.381670 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-n27b9" event={"ID":"a621a417-b542-4b88-a0b0-12b77aee924b","Type":"ContainerStarted","Data":"f5ffcef25f79c195cfdcc5149b74465aab7b8409c54c38b4845250e19539e65b"} Nov 26 15:22:53 crc kubenswrapper[4785]: I1126 15:22:53.970046 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-lxj68"] Nov 26 15:22:53 crc kubenswrapper[4785]: I1126 15:22:53.971155 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-lxj68" podUID="d0c4491d-f126-40e0-b816-edda3257c939" containerName="registry-server" containerID="cri-o://30a99bd1d04da190c9ca9bbce3f155e1e1e59b8f6ee5df199b9eb29d72cc0b96" gracePeriod=2 Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.397388 4785 generic.go:334] "Generic (PLEG): container finished" podID="d0c4491d-f126-40e0-b816-edda3257c939" containerID="30a99bd1d04da190c9ca9bbce3f155e1e1e59b8f6ee5df199b9eb29d72cc0b96" exitCode=0 Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.397474 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-lxj68" event={"ID":"d0c4491d-f126-40e0-b816-edda3257c939","Type":"ContainerDied","Data":"30a99bd1d04da190c9ca9bbce3f155e1e1e59b8f6ee5df199b9eb29d72cc0b96"} Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.399132 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-n27b9" event={"ID":"a621a417-b542-4b88-a0b0-12b77aee924b","Type":"ContainerStarted","Data":"5baef39a4c7dbcebbc9aacbae6420f83dd4301b566f4ada47daedbd4134cf72e"} Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.416119 4785 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-n27b9" podStartSLOduration=1.953662557 podStartE2EDuration="3.416098969s" podCreationTimestamp="2025-11-26 15:22:51 +0000 UTC" firstStartedPulling="2025-11-26 15:22:52.01072563 +0000 UTC m=+935.689091394" lastFinishedPulling="2025-11-26 15:22:53.473162042 +0000 UTC m=+937.151527806" observedRunningTime="2025-11-26 15:22:54.411329045 +0000 UTC m=+938.089694819" watchObservedRunningTime="2025-11-26 15:22:54.416098969 +0000 UTC m=+938.094464753" Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.589609 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-tssqb"] Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.590637 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.599178 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-tssqb"] Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.741705 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjtzq\" (UniqueName: \"kubernetes.io/projected/3e49c55a-9d72-4b78-ac75-84fba908f67b-kube-api-access-rjtzq\") pod \"horizon-operator-index-tssqb\" (UID: \"3e49c55a-9d72-4b78-ac75-84fba908f67b\") " pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.843131 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjtzq\" (UniqueName: \"kubernetes.io/projected/3e49c55a-9d72-4b78-ac75-84fba908f67b-kube-api-access-rjtzq\") pod \"horizon-operator-index-tssqb\" (UID: \"3e49c55a-9d72-4b78-ac75-84fba908f67b\") " pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 
15:22:54.868975 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjtzq\" (UniqueName: \"kubernetes.io/projected/3e49c55a-9d72-4b78-ac75-84fba908f67b-kube-api-access-rjtzq\") pod \"horizon-operator-index-tssqb\" (UID: \"3e49c55a-9d72-4b78-ac75-84fba908f67b\") " pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:22:54 crc kubenswrapper[4785]: I1126 15:22:54.911134 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:22:56 crc kubenswrapper[4785]: I1126 15:22:56.420634 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-lxj68" event={"ID":"d0c4491d-f126-40e0-b816-edda3257c939","Type":"ContainerDied","Data":"8e4aabc8a60869496366a9e6a8eff2cfa0f8ea6920fa348546d1ef662c71af5b"} Nov 26 15:22:56 crc kubenswrapper[4785]: I1126 15:22:56.420676 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e4aabc8a60869496366a9e6a8eff2cfa0f8ea6920fa348546d1ef662c71af5b" Nov 26 15:22:56 crc kubenswrapper[4785]: I1126 15:22:56.468314 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:56 crc kubenswrapper[4785]: I1126 15:22:56.570898 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl6fn\" (UniqueName: \"kubernetes.io/projected/d0c4491d-f126-40e0-b816-edda3257c939-kube-api-access-zl6fn\") pod \"d0c4491d-f126-40e0-b816-edda3257c939\" (UID: \"d0c4491d-f126-40e0-b816-edda3257c939\") " Nov 26 15:22:56 crc kubenswrapper[4785]: I1126 15:22:56.580959 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c4491d-f126-40e0-b816-edda3257c939-kube-api-access-zl6fn" (OuterVolumeSpecName: "kube-api-access-zl6fn") pod "d0c4491d-f126-40e0-b816-edda3257c939" (UID: "d0c4491d-f126-40e0-b816-edda3257c939"). InnerVolumeSpecName "kube-api-access-zl6fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:22:56 crc kubenswrapper[4785]: I1126 15:22:56.672223 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl6fn\" (UniqueName: \"kubernetes.io/projected/d0c4491d-f126-40e0-b816-edda3257c939-kube-api-access-zl6fn\") on node \"crc\" DevicePath \"\"" Nov 26 15:22:57 crc kubenswrapper[4785]: I1126 15:22:57.426609 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-lxj68" Nov 26 15:22:57 crc kubenswrapper[4785]: I1126 15:22:57.444030 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-lxj68"] Nov 26 15:22:57 crc kubenswrapper[4785]: I1126 15:22:57.456166 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-lxj68"] Nov 26 15:22:57 crc kubenswrapper[4785]: I1126 15:22:57.776628 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-n27b9"] Nov 26 15:22:57 crc kubenswrapper[4785]: I1126 15:22:57.776800 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-n27b9" podUID="a621a417-b542-4b88-a0b0-12b77aee924b" containerName="registry-server" containerID="cri-o://5baef39a4c7dbcebbc9aacbae6420f83dd4301b566f4ada47daedbd4134cf72e" gracePeriod=2 Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.436432 4785 generic.go:334] "Generic (PLEG): container finished" podID="a621a417-b542-4b88-a0b0-12b77aee924b" containerID="5baef39a4c7dbcebbc9aacbae6420f83dd4301b566f4ada47daedbd4134cf72e" exitCode=0 Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.436760 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-n27b9" event={"ID":"a621a417-b542-4b88-a0b0-12b77aee924b","Type":"ContainerDied","Data":"5baef39a4c7dbcebbc9aacbae6420f83dd4301b566f4ada47daedbd4134cf72e"} Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.584936 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-c4ks4"] Nov 26 15:22:58 crc kubenswrapper[4785]: E1126 15:22:58.585176 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c4491d-f126-40e0-b816-edda3257c939" containerName="registry-server" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.585189 4785 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c4491d-f126-40e0-b816-edda3257c939" containerName="registry-server" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.585344 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c4491d-f126-40e0-b816-edda3257c939" containerName="registry-server" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.586296 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.588867 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-c4ks4"] Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.599461 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgdx7\" (UniqueName: \"kubernetes.io/projected/6030eac1-f066-489e-9a17-5c2dd5d5880a-kube-api-access-fgdx7\") pod \"swift-operator-index-c4ks4\" (UID: \"6030eac1-f066-489e-9a17-5c2dd5d5880a\") " pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.700857 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgdx7\" (UniqueName: \"kubernetes.io/projected/6030eac1-f066-489e-9a17-5c2dd5d5880a-kube-api-access-fgdx7\") pod \"swift-operator-index-c4ks4\" (UID: \"6030eac1-f066-489e-9a17-5c2dd5d5880a\") " pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.718874 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgdx7\" (UniqueName: \"kubernetes.io/projected/6030eac1-f066-489e-9a17-5c2dd5d5880a-kube-api-access-fgdx7\") pod \"swift-operator-index-c4ks4\" (UID: \"6030eac1-f066-489e-9a17-5c2dd5d5880a\") " pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:22:58 crc kubenswrapper[4785]: I1126 15:22:58.903157 4785 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:22:59 crc kubenswrapper[4785]: I1126 15:22:59.044614 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c4491d-f126-40e0-b816-edda3257c939" path="/var/lib/kubelet/pods/d0c4491d-f126-40e0-b816-edda3257c939/volumes" Nov 26 15:23:01 crc kubenswrapper[4785]: I1126 15:23:01.757678 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.004312 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.149721 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsb5r\" (UniqueName: \"kubernetes.io/projected/a621a417-b542-4b88-a0b0-12b77aee924b-kube-api-access-fsb5r\") pod \"a621a417-b542-4b88-a0b0-12b77aee924b\" (UID: \"a621a417-b542-4b88-a0b0-12b77aee924b\") " Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.168984 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a621a417-b542-4b88-a0b0-12b77aee924b-kube-api-access-fsb5r" (OuterVolumeSpecName: "kube-api-access-fsb5r") pod "a621a417-b542-4b88-a0b0-12b77aee924b" (UID: "a621a417-b542-4b88-a0b0-12b77aee924b"). InnerVolumeSpecName "kube-api-access-fsb5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.250748 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsb5r\" (UniqueName: \"kubernetes.io/projected/a621a417-b542-4b88-a0b0-12b77aee924b-kube-api-access-fsb5r\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.363259 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-c4ks4"] Nov 26 15:23:02 crc kubenswrapper[4785]: W1126 15:23:02.367773 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6030eac1_f066_489e_9a17_5c2dd5d5880a.slice/crio-c1cdb446eb34e0125ed7318d6b29a5852faef51710d0d1553a00390a343fb26f WatchSource:0}: Error finding container c1cdb446eb34e0125ed7318d6b29a5852faef51710d0d1553a00390a343fb26f: Status 404 returned error can't find the container with id c1cdb446eb34e0125ed7318d6b29a5852faef51710d0d1553a00390a343fb26f Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.467362 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-n27b9" event={"ID":"a621a417-b542-4b88-a0b0-12b77aee924b","Type":"ContainerDied","Data":"f5ffcef25f79c195cfdcc5149b74465aab7b8409c54c38b4845250e19539e65b"} Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.467427 4785 scope.go:117] "RemoveContainer" containerID="5baef39a4c7dbcebbc9aacbae6420f83dd4301b566f4ada47daedbd4134cf72e" Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.467612 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-n27b9" Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.473431 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-c4ks4" event={"ID":"6030eac1-f066-489e-9a17-5c2dd5d5880a","Type":"ContainerStarted","Data":"c1cdb446eb34e0125ed7318d6b29a5852faef51710d0d1553a00390a343fb26f"} Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.476435 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-26zth" event={"ID":"d6236e0d-402b-4338-93c8-4f1a65013acc","Type":"ContainerStarted","Data":"4f7040ec43f8c18d36fb6378aa0d99d6ca53b4c91c2ccacefed46dbcec1a1bef"} Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.478134 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-tssqb"] Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.496289 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-26zth" podStartSLOduration=1.12855247 podStartE2EDuration="11.496275448s" podCreationTimestamp="2025-11-26 15:22:51 +0000 UTC" firstStartedPulling="2025-11-26 15:22:51.890281602 +0000 UTC m=+935.568647376" lastFinishedPulling="2025-11-26 15:23:02.2580046 +0000 UTC m=+945.936370354" observedRunningTime="2025-11-26 15:23:02.494592935 +0000 UTC m=+946.172958739" watchObservedRunningTime="2025-11-26 15:23:02.496275448 +0000 UTC m=+946.174641212" Nov 26 15:23:02 crc kubenswrapper[4785]: W1126 15:23:02.500628 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e49c55a_9d72_4b78_ac75_84fba908f67b.slice/crio-c826eacbe8cdbbf302e8b383ce88ef373e52d716e634c8d4c241c00cb11b4be4 WatchSource:0}: Error finding container c826eacbe8cdbbf302e8b383ce88ef373e52d716e634c8d4c241c00cb11b4be4: Status 404 returned error can't find the container with id 
c826eacbe8cdbbf302e8b383ce88ef373e52d716e634c8d4c241c00cb11b4be4 Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.514049 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-n27b9"] Nov 26 15:23:02 crc kubenswrapper[4785]: I1126 15:23:02.519160 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-n27b9"] Nov 26 15:23:03 crc kubenswrapper[4785]: I1126 15:23:03.053083 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a621a417-b542-4b88-a0b0-12b77aee924b" path="/var/lib/kubelet/pods/a621a417-b542-4b88-a0b0-12b77aee924b/volumes" Nov 26 15:23:03 crc kubenswrapper[4785]: I1126 15:23:03.487397 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-c4ks4" event={"ID":"6030eac1-f066-489e-9a17-5c2dd5d5880a","Type":"ContainerStarted","Data":"e9134d1e3c656f9f55a358e86e7548a819466f0e8ee18b253b7316f31a3d15bf"} Nov 26 15:23:03 crc kubenswrapper[4785]: I1126 15:23:03.490668 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-tssqb" event={"ID":"3e49c55a-9d72-4b78-ac75-84fba908f67b","Type":"ContainerStarted","Data":"cd5dc1379a7e25055879538f9972aeb09a98564c7941c3a1b2f89d81db7c912d"} Nov 26 15:23:03 crc kubenswrapper[4785]: I1126 15:23:03.490730 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-tssqb" event={"ID":"3e49c55a-9d72-4b78-ac75-84fba908f67b","Type":"ContainerStarted","Data":"c826eacbe8cdbbf302e8b383ce88ef373e52d716e634c8d4c241c00cb11b4be4"} Nov 26 15:23:03 crc kubenswrapper[4785]: I1126 15:23:03.539493 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-tssqb" podStartSLOduration=8.813824712 podStartE2EDuration="9.539469608s" podCreationTimestamp="2025-11-26 15:22:54 +0000 UTC" firstStartedPulling="2025-11-26 15:23:02.507950003 +0000 
UTC m=+946.186315767" lastFinishedPulling="2025-11-26 15:23:03.233594879 +0000 UTC m=+946.911960663" observedRunningTime="2025-11-26 15:23:03.539146729 +0000 UTC m=+947.217512533" watchObservedRunningTime="2025-11-26 15:23:03.539469608 +0000 UTC m=+947.217835382" Nov 26 15:23:03 crc kubenswrapper[4785]: I1126 15:23:03.543927 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-c4ks4" podStartSLOduration=4.729294459 podStartE2EDuration="5.543903683s" podCreationTimestamp="2025-11-26 15:22:58 +0000 UTC" firstStartedPulling="2025-11-26 15:23:02.370317127 +0000 UTC m=+946.048682891" lastFinishedPulling="2025-11-26 15:23:03.184926321 +0000 UTC m=+946.863292115" observedRunningTime="2025-11-26 15:23:03.514294422 +0000 UTC m=+947.192660286" watchObservedRunningTime="2025-11-26 15:23:03.543903683 +0000 UTC m=+947.222269467" Nov 26 15:23:04 crc kubenswrapper[4785]: I1126 15:23:04.912660 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:23:04 crc kubenswrapper[4785]: I1126 15:23:04.913027 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:23:04 crc kubenswrapper[4785]: I1126 15:23:04.950849 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:23:06 crc kubenswrapper[4785]: I1126 15:23:06.513489 4785 generic.go:334] "Generic (PLEG): container finished" podID="d6236e0d-402b-4338-93c8-4f1a65013acc" containerID="4f7040ec43f8c18d36fb6378aa0d99d6ca53b4c91c2ccacefed46dbcec1a1bef" exitCode=0 Nov 26 15:23:06 crc kubenswrapper[4785]: I1126 15:23:06.513574 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-26zth" 
event={"ID":"d6236e0d-402b-4338-93c8-4f1a65013acc","Type":"ContainerDied","Data":"4f7040ec43f8c18d36fb6378aa0d99d6ca53b4c91c2ccacefed46dbcec1a1bef"} Nov 26 15:23:07 crc kubenswrapper[4785]: I1126 15:23:07.289222 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:23:07 crc kubenswrapper[4785]: I1126 15:23:07.289306 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:23:07 crc kubenswrapper[4785]: I1126 15:23:07.839780 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.041650 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6236e0d-402b-4338-93c8-4f1a65013acc-config-data\") pod \"d6236e0d-402b-4338-93c8-4f1a65013acc\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.041700 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z49cx\" (UniqueName: \"kubernetes.io/projected/d6236e0d-402b-4338-93c8-4f1a65013acc-kube-api-access-z49cx\") pod \"d6236e0d-402b-4338-93c8-4f1a65013acc\" (UID: \"d6236e0d-402b-4338-93c8-4f1a65013acc\") " Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.046680 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6236e0d-402b-4338-93c8-4f1a65013acc-kube-api-access-z49cx" (OuterVolumeSpecName: "kube-api-access-z49cx") pod "d6236e0d-402b-4338-93c8-4f1a65013acc" (UID: "d6236e0d-402b-4338-93c8-4f1a65013acc"). InnerVolumeSpecName "kube-api-access-z49cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.089283 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6236e0d-402b-4338-93c8-4f1a65013acc-config-data" (OuterVolumeSpecName: "config-data") pod "d6236e0d-402b-4338-93c8-4f1a65013acc" (UID: "d6236e0d-402b-4338-93c8-4f1a65013acc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.144623 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6236e0d-402b-4338-93c8-4f1a65013acc-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.144658 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z49cx\" (UniqueName: \"kubernetes.io/projected/d6236e0d-402b-4338-93c8-4f1a65013acc-kube-api-access-z49cx\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.530483 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-26zth" event={"ID":"d6236e0d-402b-4338-93c8-4f1a65013acc","Type":"ContainerDied","Data":"359bd55762df065a9ac318a311bd8010b7a865450ef9fbf33a8c1cefc4c29ce1"} Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.530520 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="359bd55762df065a9ac318a311bd8010b7a865450ef9fbf33a8c1cefc4c29ce1" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.530641 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-26zth" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.717385 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-88srj"] Nov 26 15:23:08 crc kubenswrapper[4785]: E1126 15:23:08.717972 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a621a417-b542-4b88-a0b0-12b77aee924b" containerName="registry-server" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.718088 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a621a417-b542-4b88-a0b0-12b77aee924b" containerName="registry-server" Nov 26 15:23:08 crc kubenswrapper[4785]: E1126 15:23:08.718188 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6236e0d-402b-4338-93c8-4f1a65013acc" containerName="keystone-db-sync" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.718260 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6236e0d-402b-4338-93c8-4f1a65013acc" containerName="keystone-db-sync" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.718466 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="a621a417-b542-4b88-a0b0-12b77aee924b" containerName="registry-server" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.718577 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6236e0d-402b-4338-93c8-4f1a65013acc" containerName="keystone-db-sync" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.719176 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.721398 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.721498 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.721665 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.721670 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gwfvc" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.721778 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.728233 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-88srj"] Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.852321 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-credential-keys\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.852354 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jzwk\" (UniqueName: \"kubernetes.io/projected/9f64f69d-c5fa-4936-b96f-c99658d8ff59-kube-api-access-5jzwk\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.852388 4785 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-scripts\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.852405 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-config-data\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.852451 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-fernet-keys\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.904449 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.904502 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.931725 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.953877 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-scripts\") pod \"keystone-bootstrap-88srj\" (UID: 
\"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.954099 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-config-data\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.954271 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-fernet-keys\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.954416 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-credential-keys\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.954440 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jzwk\" (UniqueName: \"kubernetes.io/projected/9f64f69d-c5fa-4936-b96f-c99658d8ff59-kube-api-access-5jzwk\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.957934 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-scripts\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " 
pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.958279 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-credential-keys\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.958833 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-config-data\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.964224 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-fernet-keys\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:08 crc kubenswrapper[4785]: I1126 15:23:08.984545 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jzwk\" (UniqueName: \"kubernetes.io/projected/9f64f69d-c5fa-4936-b96f-c99658d8ff59-kube-api-access-5jzwk\") pod \"keystone-bootstrap-88srj\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:09 crc kubenswrapper[4785]: I1126 15:23:09.035982 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:09 crc kubenswrapper[4785]: I1126 15:23:09.458131 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-88srj"] Nov 26 15:23:09 crc kubenswrapper[4785]: I1126 15:23:09.542873 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-88srj" event={"ID":"9f64f69d-c5fa-4936-b96f-c99658d8ff59","Type":"ContainerStarted","Data":"fe93776fa60973758b75ab5222395a89bcfaeaa5111a4722af1f8b8620dc7eca"} Nov 26 15:23:09 crc kubenswrapper[4785]: I1126 15:23:09.591464 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-c4ks4" Nov 26 15:23:10 crc kubenswrapper[4785]: I1126 15:23:10.548975 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-88srj" event={"ID":"9f64f69d-c5fa-4936-b96f-c99658d8ff59","Type":"ContainerStarted","Data":"65b12e9b3c4851b5c6da5e144adb4e86f98c4f43ea830afdc18b44b3103d9e80"} Nov 26 15:23:12 crc kubenswrapper[4785]: I1126 15:23:12.560857 4785 generic.go:334] "Generic (PLEG): container finished" podID="9f64f69d-c5fa-4936-b96f-c99658d8ff59" containerID="65b12e9b3c4851b5c6da5e144adb4e86f98c4f43ea830afdc18b44b3103d9e80" exitCode=0 Nov 26 15:23:12 crc kubenswrapper[4785]: I1126 15:23:12.560921 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-88srj" event={"ID":"9f64f69d-c5fa-4936-b96f-c99658d8ff59","Type":"ContainerDied","Data":"65b12e9b3c4851b5c6da5e144adb4e86f98c4f43ea830afdc18b44b3103d9e80"} Nov 26 15:23:13 crc kubenswrapper[4785]: I1126 15:23:13.903164 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.026183 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jzwk\" (UniqueName: \"kubernetes.io/projected/9f64f69d-c5fa-4936-b96f-c99658d8ff59-kube-api-access-5jzwk\") pod \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.026523 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-scripts\") pod \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.027275 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-credential-keys\") pod \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.027789 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-fernet-keys\") pod \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.028420 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-config-data\") pod \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\" (UID: \"9f64f69d-c5fa-4936-b96f-c99658d8ff59\") " Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.031485 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9f64f69d-c5fa-4936-b96f-c99658d8ff59" (UID: "9f64f69d-c5fa-4936-b96f-c99658d8ff59"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.031819 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9f64f69d-c5fa-4936-b96f-c99658d8ff59" (UID: "9f64f69d-c5fa-4936-b96f-c99658d8ff59"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.032843 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f64f69d-c5fa-4936-b96f-c99658d8ff59-kube-api-access-5jzwk" (OuterVolumeSpecName: "kube-api-access-5jzwk") pod "9f64f69d-c5fa-4936-b96f-c99658d8ff59" (UID: "9f64f69d-c5fa-4936-b96f-c99658d8ff59"). InnerVolumeSpecName "kube-api-access-5jzwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.033439 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-scripts" (OuterVolumeSpecName: "scripts") pod "9f64f69d-c5fa-4936-b96f-c99658d8ff59" (UID: "9f64f69d-c5fa-4936-b96f-c99658d8ff59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.048668 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-config-data" (OuterVolumeSpecName: "config-data") pod "9f64f69d-c5fa-4936-b96f-c99658d8ff59" (UID: "9f64f69d-c5fa-4936-b96f-c99658d8ff59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.129512 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.130421 4785 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.130439 4785 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.130451 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f64f69d-c5fa-4936-b96f-c99658d8ff59-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.130465 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jzwk\" (UniqueName: \"kubernetes.io/projected/9f64f69d-c5fa-4936-b96f-c99658d8ff59-kube-api-access-5jzwk\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.581048 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-88srj" event={"ID":"9f64f69d-c5fa-4936-b96f-c99658d8ff59","Type":"ContainerDied","Data":"fe93776fa60973758b75ab5222395a89bcfaeaa5111a4722af1f8b8620dc7eca"} Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.581456 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe93776fa60973758b75ab5222395a89bcfaeaa5111a4722af1f8b8620dc7eca" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.581111 4785 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-88srj" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.673441 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-5b85f48447-rwscr"] Nov 26 15:23:14 crc kubenswrapper[4785]: E1126 15:23:14.673754 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f64f69d-c5fa-4936-b96f-c99658d8ff59" containerName="keystone-bootstrap" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.673773 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f64f69d-c5fa-4936-b96f-c99658d8ff59" containerName="keystone-bootstrap" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.673922 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f64f69d-c5fa-4936-b96f-c99658d8ff59" containerName="keystone-bootstrap" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.674429 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.676424 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.676608 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gwfvc" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.676679 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.676605 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.687717 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-5b85f48447-rwscr"] Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.839743 
4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-credential-keys\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.840274 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-fernet-keys\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.840468 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-scripts\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.840667 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkc2k\" (UniqueName: \"kubernetes.io/projected/578b1e05-62bf-4cc2-921f-cbfccf41a170-kube-api-access-fkc2k\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.840877 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-config-data\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc 
kubenswrapper[4785]: I1126 15:23:14.942454 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-credential-keys\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.942546 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-fernet-keys\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.942619 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-scripts\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.942651 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkc2k\" (UniqueName: \"kubernetes.io/projected/578b1e05-62bf-4cc2-921f-cbfccf41a170-kube-api-access-fkc2k\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.942677 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-config-data\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.944846 4785 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-tssqb" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.947759 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-credential-keys\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.948884 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-config-data\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.952104 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-fernet-keys\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.952977 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578b1e05-62bf-4cc2-921f-cbfccf41a170-scripts\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.967086 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkc2k\" (UniqueName: \"kubernetes.io/projected/578b1e05-62bf-4cc2-921f-cbfccf41a170-kube-api-access-fkc2k\") pod \"keystone-5b85f48447-rwscr\" (UID: \"578b1e05-62bf-4cc2-921f-cbfccf41a170\") " 
pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:14 crc kubenswrapper[4785]: I1126 15:23:14.992221 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:15 crc kubenswrapper[4785]: I1126 15:23:15.399123 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-5b85f48447-rwscr"] Nov 26 15:23:15 crc kubenswrapper[4785]: W1126 15:23:15.405054 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578b1e05_62bf_4cc2_921f_cbfccf41a170.slice/crio-7a8615f3d95cda20060bccd971207920294b7712a395696772f26726430b1de7 WatchSource:0}: Error finding container 7a8615f3d95cda20060bccd971207920294b7712a395696772f26726430b1de7: Status 404 returned error can't find the container with id 7a8615f3d95cda20060bccd971207920294b7712a395696772f26726430b1de7 Nov 26 15:23:15 crc kubenswrapper[4785]: I1126 15:23:15.587941 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" event={"ID":"578b1e05-62bf-4cc2-921f-cbfccf41a170","Type":"ContainerStarted","Data":"b5c63991a3c44dadea9758c1d96a78e07f6aba845f6549a651c2adc6402e1019"} Nov 26 15:23:15 crc kubenswrapper[4785]: I1126 15:23:15.588254 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" event={"ID":"578b1e05-62bf-4cc2-921f-cbfccf41a170","Type":"ContainerStarted","Data":"7a8615f3d95cda20060bccd971207920294b7712a395696772f26726430b1de7"} Nov 26 15:23:15 crc kubenswrapper[4785]: I1126 15:23:15.588271 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:15 crc kubenswrapper[4785]: I1126 15:23:15.607789 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" podStartSLOduration=1.607769062 
podStartE2EDuration="1.607769062s" podCreationTimestamp="2025-11-26 15:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:23:15.603576463 +0000 UTC m=+959.281942237" watchObservedRunningTime="2025-11-26 15:23:15.607769062 +0000 UTC m=+959.286134826" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.256906 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9"] Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.259763 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.268386 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sfhll" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.276802 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9"] Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.300728 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.300882 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: 
\"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.301020 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7flc6\" (UniqueName: \"kubernetes.io/projected/353baaac-9344-4e21-af31-6695033fd724-kube-api-access-7flc6\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.402215 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.402330 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.402484 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flc6\" (UniqueName: \"kubernetes.io/projected/353baaac-9344-4e21-af31-6695033fd724-kube-api-access-7flc6\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " 
pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.402926 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.403267 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.427730 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flc6\" (UniqueName: \"kubernetes.io/projected/353baaac-9344-4e21-af31-6695033fd724-kube-api-access-7flc6\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:33 crc kubenswrapper[4785]: I1126 15:23:33.614671 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.044297 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp"] Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.046964 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.056915 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp"] Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.090209 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9"] Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.112677 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.112756 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stqrq\" (UniqueName: \"kubernetes.io/projected/a6f1913b-2431-4998-a360-fc87606e990e-kube-api-access-stqrq\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.112794 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc 
kubenswrapper[4785]: I1126 15:23:34.213950 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.214112 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stqrq\" (UniqueName: \"kubernetes.io/projected/a6f1913b-2431-4998-a360-fc87606e990e-kube-api-access-stqrq\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.214182 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.214617 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.214749 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.240995 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stqrq\" (UniqueName: \"kubernetes.io/projected/a6f1913b-2431-4998-a360-fc87606e990e-kube-api-access-stqrq\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.368020 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.739239 4785 generic.go:334] "Generic (PLEG): container finished" podID="353baaac-9344-4e21-af31-6695033fd724" containerID="01c727b00e82e4d31b8f64618bdd38d751bd304f2462e40d89c395eaf87f0f4e" exitCode=0 Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.739285 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" event={"ID":"353baaac-9344-4e21-af31-6695033fd724","Type":"ContainerDied","Data":"01c727b00e82e4d31b8f64618bdd38d751bd304f2462e40d89c395eaf87f0f4e"} Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.739582 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" event={"ID":"353baaac-9344-4e21-af31-6695033fd724","Type":"ContainerStarted","Data":"bb1581933e72240f3b04262e3cae16896637e5e9f729f056cb52e378e8c89c24"} Nov 26 15:23:34 crc 
kubenswrapper[4785]: I1126 15:23:34.741019 4785 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:23:34 crc kubenswrapper[4785]: I1126 15:23:34.855679 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp"] Nov 26 15:23:35 crc kubenswrapper[4785]: I1126 15:23:35.749576 4785 generic.go:334] "Generic (PLEG): container finished" podID="a6f1913b-2431-4998-a360-fc87606e990e" containerID="ce16edcdff1179765394ef84142d7987d1e22b092df503be99f23942437db2a3" exitCode=0 Nov 26 15:23:35 crc kubenswrapper[4785]: I1126 15:23:35.749695 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" event={"ID":"a6f1913b-2431-4998-a360-fc87606e990e","Type":"ContainerDied","Data":"ce16edcdff1179765394ef84142d7987d1e22b092df503be99f23942437db2a3"} Nov 26 15:23:35 crc kubenswrapper[4785]: I1126 15:23:35.749884 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" event={"ID":"a6f1913b-2431-4998-a360-fc87606e990e","Type":"ContainerStarted","Data":"668b0eb7895737e73e39eed42d3013b9e5725f37ec8c92d940e1cf5d54868843"} Nov 26 15:23:36 crc kubenswrapper[4785]: I1126 15:23:36.758307 4785 generic.go:334] "Generic (PLEG): container finished" podID="353baaac-9344-4e21-af31-6695033fd724" containerID="387ee93221dc61a0b53aaa5b6b65b991e61e7fef25f1e6c06b22cf47a3f9a5fe" exitCode=0 Nov 26 15:23:36 crc kubenswrapper[4785]: I1126 15:23:36.758390 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" event={"ID":"353baaac-9344-4e21-af31-6695033fd724","Type":"ContainerDied","Data":"387ee93221dc61a0b53aaa5b6b65b991e61e7fef25f1e6c06b22cf47a3f9a5fe"} Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 
15:23:37.289569 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.289641 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.289696 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.290451 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6ffa546d6cd3d829cf51dc79161c420c203f2371af580e656e6ea6f7619320e"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.290545 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://a6ffa546d6cd3d829cf51dc79161c420c203f2371af580e656e6ea6f7619320e" gracePeriod=600 Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.768886 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerID="a6ffa546d6cd3d829cf51dc79161c420c203f2371af580e656e6ea6f7619320e" exitCode=0 Nov 26 
15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.768964 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"a6ffa546d6cd3d829cf51dc79161c420c203f2371af580e656e6ea6f7619320e"} Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.769324 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"931a84441789a47b1ac55236dde8c88c217189d0a2b8d7b82c6917d783312096"} Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.769347 4785 scope.go:117] "RemoveContainer" containerID="5335547fe096a0c8db3ef049b0bdabe9e8e04a213e13c79d8696fbb738fcc9fa" Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.772068 4785 generic.go:334] "Generic (PLEG): container finished" podID="353baaac-9344-4e21-af31-6695033fd724" containerID="599dcea87185dc3fb56f5e49435dc2b057ebf3b12c62237198b8169b566f8a93" exitCode=0 Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.772187 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" event={"ID":"353baaac-9344-4e21-af31-6695033fd724","Type":"ContainerDied","Data":"599dcea87185dc3fb56f5e49435dc2b057ebf3b12c62237198b8169b566f8a93"} Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.774627 4785 generic.go:334] "Generic (PLEG): container finished" podID="a6f1913b-2431-4998-a360-fc87606e990e" containerID="ace581d6243f903fa663b054d406f1e7777326cdf66bdfd4ef62cc1ebc5407b5" exitCode=0 Nov 26 15:23:37 crc kubenswrapper[4785]: I1126 15:23:37.774663 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" 
event={"ID":"a6f1913b-2431-4998-a360-fc87606e990e","Type":"ContainerDied","Data":"ace581d6243f903fa663b054d406f1e7777326cdf66bdfd4ef62cc1ebc5407b5"} Nov 26 15:23:38 crc kubenswrapper[4785]: I1126 15:23:38.794983 4785 generic.go:334] "Generic (PLEG): container finished" podID="a6f1913b-2431-4998-a360-fc87606e990e" containerID="4fdae1a56fa7d191f1a597febc5ed45be237d928844b1199cf540df2850117f6" exitCode=0 Nov 26 15:23:38 crc kubenswrapper[4785]: I1126 15:23:38.795063 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" event={"ID":"a6f1913b-2431-4998-a360-fc87606e990e","Type":"ContainerDied","Data":"4fdae1a56fa7d191f1a597febc5ed45be237d928844b1199cf540df2850117f6"} Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.102377 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.193064 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-bundle\") pod \"353baaac-9344-4e21-af31-6695033fd724\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.193179 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7flc6\" (UniqueName: \"kubernetes.io/projected/353baaac-9344-4e21-af31-6695033fd724-kube-api-access-7flc6\") pod \"353baaac-9344-4e21-af31-6695033fd724\" (UID: \"353baaac-9344-4e21-af31-6695033fd724\") " Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.193250 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-util\") pod \"353baaac-9344-4e21-af31-6695033fd724\" (UID: 
\"353baaac-9344-4e21-af31-6695033fd724\") " Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.193857 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-bundle" (OuterVolumeSpecName: "bundle") pod "353baaac-9344-4e21-af31-6695033fd724" (UID: "353baaac-9344-4e21-af31-6695033fd724"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.200306 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353baaac-9344-4e21-af31-6695033fd724-kube-api-access-7flc6" (OuterVolumeSpecName: "kube-api-access-7flc6") pod "353baaac-9344-4e21-af31-6695033fd724" (UID: "353baaac-9344-4e21-af31-6695033fd724"). InnerVolumeSpecName "kube-api-access-7flc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.212751 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-util" (OuterVolumeSpecName: "util") pod "353baaac-9344-4e21-af31-6695033fd724" (UID: "353baaac-9344-4e21-af31-6695033fd724"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.294420 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.294708 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/353baaac-9344-4e21-af31-6695033fd724-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.294718 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7flc6\" (UniqueName: \"kubernetes.io/projected/353baaac-9344-4e21-af31-6695033fd724-kube-api-access-7flc6\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.803356 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" event={"ID":"353baaac-9344-4e21-af31-6695033fd724","Type":"ContainerDied","Data":"bb1581933e72240f3b04262e3cae16896637e5e9f729f056cb52e378e8c89c24"} Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.803435 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1581933e72240f3b04262e3cae16896637e5e9f729f056cb52e378e8c89c24" Nov 26 15:23:39 crc kubenswrapper[4785]: I1126 15:23:39.803390 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.058858 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.205155 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-util\") pod \"a6f1913b-2431-4998-a360-fc87606e990e\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.205260 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-bundle\") pod \"a6f1913b-2431-4998-a360-fc87606e990e\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.205301 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stqrq\" (UniqueName: \"kubernetes.io/projected/a6f1913b-2431-4998-a360-fc87606e990e-kube-api-access-stqrq\") pod \"a6f1913b-2431-4998-a360-fc87606e990e\" (UID: \"a6f1913b-2431-4998-a360-fc87606e990e\") " Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.207891 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-bundle" (OuterVolumeSpecName: "bundle") pod "a6f1913b-2431-4998-a360-fc87606e990e" (UID: "a6f1913b-2431-4998-a360-fc87606e990e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.216713 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f1913b-2431-4998-a360-fc87606e990e-kube-api-access-stqrq" (OuterVolumeSpecName: "kube-api-access-stqrq") pod "a6f1913b-2431-4998-a360-fc87606e990e" (UID: "a6f1913b-2431-4998-a360-fc87606e990e"). InnerVolumeSpecName "kube-api-access-stqrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.307312 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.307367 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stqrq\" (UniqueName: \"kubernetes.io/projected/a6f1913b-2431-4998-a360-fc87606e990e-kube-api-access-stqrq\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.638044 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-util" (OuterVolumeSpecName: "util") pod "a6f1913b-2431-4998-a360-fc87606e990e" (UID: "a6f1913b-2431-4998-a360-fc87606e990e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.714051 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f1913b-2431-4998-a360-fc87606e990e-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.812893 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" event={"ID":"a6f1913b-2431-4998-a360-fc87606e990e","Type":"ContainerDied","Data":"668b0eb7895737e73e39eed42d3013b9e5725f37ec8c92d940e1cf5d54868843"} Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.812930 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp" Nov 26 15:23:40 crc kubenswrapper[4785]: I1126 15:23:40.812988 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="668b0eb7895737e73e39eed42d3013b9e5725f37ec8c92d940e1cf5d54868843" Nov 26 15:23:46 crc kubenswrapper[4785]: I1126 15:23:46.554908 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-5b85f48447-rwscr" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.565501 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh"] Nov 26 15:23:51 crc kubenswrapper[4785]: E1126 15:23:51.566387 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="pull" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566405 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="pull" Nov 26 15:23:51 crc kubenswrapper[4785]: E1126 15:23:51.566421 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="pull" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566430 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="pull" Nov 26 15:23:51 crc kubenswrapper[4785]: E1126 15:23:51.566448 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="util" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566456 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="util" Nov 26 15:23:51 crc kubenswrapper[4785]: E1126 15:23:51.566466 4785 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="extract" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566473 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="extract" Nov 26 15:23:51 crc kubenswrapper[4785]: E1126 15:23:51.566482 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="util" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566489 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="util" Nov 26 15:23:51 crc kubenswrapper[4785]: E1126 15:23:51.566503 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="extract" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566510 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="extract" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566656 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="353baaac-9344-4e21-af31-6695033fd724" containerName="extract" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.566675 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f1913b-2431-4998-a360-fc87606e990e" containerName="extract" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.567205 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.570371 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.570593 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4hcqv" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.584567 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh"] Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.662140 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-webhook-cert\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.662217 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-apiservice-cert\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.662385 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v66f\" (UniqueName: \"kubernetes.io/projected/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-kube-api-access-4v66f\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: 
\"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.763492 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v66f\" (UniqueName: \"kubernetes.io/projected/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-kube-api-access-4v66f\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.763567 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-webhook-cert\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.763589 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-apiservice-cert\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.771161 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-apiservice-cert\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.772163 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-webhook-cert\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.781776 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v66f\" (UniqueName: \"kubernetes.io/projected/3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e-kube-api-access-4v66f\") pod \"horizon-operator-controller-manager-647db694df-qnrxh\" (UID: \"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e\") " pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:51 crc kubenswrapper[4785]: I1126 15:23:51.883621 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:52 crc kubenswrapper[4785]: I1126 15:23:52.076578 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh"] Nov 26 15:23:52 crc kubenswrapper[4785]: I1126 15:23:52.907789 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" event={"ID":"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e","Type":"ContainerStarted","Data":"2debf7537ed071eca83b4b354931889cada63eaff3f1328361cc6eb218aaf6bd"} Nov 26 15:23:54 crc kubenswrapper[4785]: I1126 15:23:54.926887 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" event={"ID":"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e","Type":"ContainerStarted","Data":"4a9013328a4c852f6798521a620dd057d4cb7ec37f86824d3e564c9aa58dfbd8"} Nov 26 15:23:54 crc kubenswrapper[4785]: I1126 15:23:54.928172 4785 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:23:54 crc kubenswrapper[4785]: I1126 15:23:54.959440 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" podStartSLOduration=1.6385964450000001 podStartE2EDuration="3.959415331s" podCreationTimestamp="2025-11-26 15:23:51 +0000 UTC" firstStartedPulling="2025-11-26 15:23:52.097494431 +0000 UTC m=+995.775860195" lastFinishedPulling="2025-11-26 15:23:54.418313317 +0000 UTC m=+998.096679081" observedRunningTime="2025-11-26 15:23:54.956717808 +0000 UTC m=+998.635083632" watchObservedRunningTime="2025-11-26 15:23:54.959415331 +0000 UTC m=+998.637781125" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.661586 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f"] Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.662807 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.664800 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xj7wq" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.665236 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.680957 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f"] Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.824067 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60b24860-07b4-4841-9c4a-a5e6456a45dc-webhook-cert\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.824423 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fltgh\" (UniqueName: \"kubernetes.io/projected/60b24860-07b4-4841-9c4a-a5e6456a45dc-kube-api-access-fltgh\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.824486 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60b24860-07b4-4841-9c4a-a5e6456a45dc-apiservice-cert\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: 
\"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.925465 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fltgh\" (UniqueName: \"kubernetes.io/projected/60b24860-07b4-4841-9c4a-a5e6456a45dc-kube-api-access-fltgh\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.925511 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60b24860-07b4-4841-9c4a-a5e6456a45dc-apiservice-cert\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.925570 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60b24860-07b4-4841-9c4a-a5e6456a45dc-webhook-cert\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.933406 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/60b24860-07b4-4841-9c4a-a5e6456a45dc-apiservice-cert\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.940591 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fltgh\" (UniqueName: \"kubernetes.io/projected/60b24860-07b4-4841-9c4a-a5e6456a45dc-kube-api-access-fltgh\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.945209 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/60b24860-07b4-4841-9c4a-a5e6456a45dc-webhook-cert\") pod \"swift-operator-controller-manager-5d784fc5bb-kn67f\" (UID: \"60b24860-07b4-4841-9c4a-a5e6456a45dc\") " pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:00 crc kubenswrapper[4785]: I1126 15:24:00.980055 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:01 crc kubenswrapper[4785]: I1126 15:24:01.382006 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f"] Nov 26 15:24:01 crc kubenswrapper[4785]: I1126 15:24:01.887827 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:24:01 crc kubenswrapper[4785]: I1126 15:24:01.973794 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" event={"ID":"60b24860-07b4-4841-9c4a-a5e6456a45dc","Type":"ContainerStarted","Data":"fc1eb18ea23366c9ef58ec0b56e40d757e1ddbb6b564c2cd0d4d2e7dde0bb925"} Nov 26 15:24:03 crc kubenswrapper[4785]: I1126 15:24:03.987930 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" 
event={"ID":"60b24860-07b4-4841-9c4a-a5e6456a45dc","Type":"ContainerStarted","Data":"5b25616e36bb67a22a9385ef0360a339f8572ff565f1fff90f7f634a3370f4b4"} Nov 26 15:24:03 crc kubenswrapper[4785]: I1126 15:24:03.988218 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:04 crc kubenswrapper[4785]: I1126 15:24:04.006818 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podStartSLOduration=1.644408253 podStartE2EDuration="4.006792503s" podCreationTimestamp="2025-11-26 15:24:00 +0000 UTC" firstStartedPulling="2025-11-26 15:24:01.391533694 +0000 UTC m=+1005.069899478" lastFinishedPulling="2025-11-26 15:24:03.753917964 +0000 UTC m=+1007.432283728" observedRunningTime="2025-11-26 15:24:04.002178859 +0000 UTC m=+1007.680544643" watchObservedRunningTime="2025-11-26 15:24:04.006792503 +0000 UTC m=+1007.685158267" Nov 26 15:24:10 crc kubenswrapper[4785]: I1126 15:24:10.986995 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:24:13 crc kubenswrapper[4785]: I1126 15:24:13.972777 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 26 15:24:13 crc kubenswrapper[4785]: I1126 15:24:13.978327 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:13 crc kubenswrapper[4785]: I1126 15:24:13.980762 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Nov 26 15:24:13 crc kubenswrapper[4785]: I1126 15:24:13.982626 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Nov 26 15:24:13 crc kubenswrapper[4785]: I1126 15:24:13.982912 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Nov 26 15:24:13 crc kubenswrapper[4785]: I1126 15:24:13.983057 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-66h9w" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.003755 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.078734 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c22f4ea9-991d-4431-be3c-aeb8f547176e-cache\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.078796 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb26f\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-kube-api-access-sb26f\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.078881 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: 
\"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.078920 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.078973 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c22f4ea9-991d-4431-be3c-aeb8f547176e-lock\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.180732 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c22f4ea9-991d-4431-be3c-aeb8f547176e-cache\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.181080 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb26f\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-kube-api-access-sb26f\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.181128 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.181159 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.181197 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c22f4ea9-991d-4431-be3c-aeb8f547176e-lock\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.182417 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c22f4ea9-991d-4431-be3c-aeb8f547176e-cache\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: E1126 15:24:14.182736 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:14 crc kubenswrapper[4785]: E1126 15:24:14.182754 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:24:14 crc kubenswrapper[4785]: E1126 15:24:14.182790 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift podName:c22f4ea9-991d-4431-be3c-aeb8f547176e nodeName:}" failed. No retries permitted until 2025-11-26 15:24:14.682775219 +0000 UTC m=+1018.361140983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift") pod "swift-storage-0" (UID: "c22f4ea9-991d-4431-be3c-aeb8f547176e") : configmap "swift-ring-files" not found Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.183076 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.183860 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c22f4ea9-991d-4431-be3c-aeb8f547176e-lock\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.211750 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb26f\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-kube-api-access-sb26f\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.216803 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: I1126 15:24:14.688847 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: 
\"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:14 crc kubenswrapper[4785]: E1126 15:24:14.689058 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:14 crc kubenswrapper[4785]: E1126 15:24:14.689087 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:24:14 crc kubenswrapper[4785]: E1126 15:24:14.689147 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift podName:c22f4ea9-991d-4431-be3c-aeb8f547176e nodeName:}" failed. No retries permitted until 2025-11-26 15:24:15.689127323 +0000 UTC m=+1019.367493087 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift") pod "swift-storage-0" (UID: "c22f4ea9-991d-4431-be3c-aeb8f547176e") : configmap "swift-ring-files" not found Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.001693 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-nhbrf"] Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.002440 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.004245 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-rmbn8" Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.015268 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-nhbrf"] Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.096665 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9s4\" (UniqueName: \"kubernetes.io/projected/abab561d-b77b-4484-803a-e3b57ce4ec3a-kube-api-access-dn9s4\") pod \"glance-operator-index-nhbrf\" (UID: \"abab561d-b77b-4484-803a-e3b57ce4ec3a\") " pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.198661 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9s4\" (UniqueName: \"kubernetes.io/projected/abab561d-b77b-4484-803a-e3b57ce4ec3a-kube-api-access-dn9s4\") pod \"glance-operator-index-nhbrf\" (UID: \"abab561d-b77b-4484-803a-e3b57ce4ec3a\") " pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.226730 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9s4\" (UniqueName: \"kubernetes.io/projected/abab561d-b77b-4484-803a-e3b57ce4ec3a-kube-api-access-dn9s4\") pod \"glance-operator-index-nhbrf\" (UID: \"abab561d-b77b-4484-803a-e3b57ce4ec3a\") " pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.318823 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.698360 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-nhbrf"] Nov 26 15:24:15 crc kubenswrapper[4785]: W1126 15:24:15.704508 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabab561d_b77b_4484_803a_e3b57ce4ec3a.slice/crio-953cf4423bb63b1b6c36dbb3dc538cd6383883e4da6f9a5e4a178e55da14e586 WatchSource:0}: Error finding container 953cf4423bb63b1b6c36dbb3dc538cd6383883e4da6f9a5e4a178e55da14e586: Status 404 returned error can't find the container with id 953cf4423bb63b1b6c36dbb3dc538cd6383883e4da6f9a5e4a178e55da14e586 Nov 26 15:24:15 crc kubenswrapper[4785]: I1126 15:24:15.706323 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:15 crc kubenswrapper[4785]: E1126 15:24:15.706522 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:15 crc kubenswrapper[4785]: E1126 15:24:15.706546 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:24:15 crc kubenswrapper[4785]: E1126 15:24:15.706615 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift podName:c22f4ea9-991d-4431-be3c-aeb8f547176e nodeName:}" failed. No retries permitted until 2025-11-26 15:24:17.70659896 +0000 UTC m=+1021.384964724 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift") pod "swift-storage-0" (UID: "c22f4ea9-991d-4431-be3c-aeb8f547176e") : configmap "swift-ring-files" not found Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.460213 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz"] Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.463894 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.466765 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.473617 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz"] Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.497865 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-nhbrf" event={"ID":"abab561d-b77b-4484-803a-e3b57ce4ec3a","Type":"ContainerStarted","Data":"953cf4423bb63b1b6c36dbb3dc538cd6383883e4da6f9a5e4a178e55da14e586"} Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.619897 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438d5704-c198-4184-aba9-e9be2025f903-log-httpd\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.620006 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438d5704-c198-4184-aba9-e9be2025f903-run-httpd\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" 
(UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.620109 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcxmb\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-kube-api-access-mcxmb\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.620185 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.620275 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438d5704-c198-4184-aba9-e9be2025f903-config-data\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.721753 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcxmb\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-kube-api-access-mcxmb\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.721829 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.721887 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438d5704-c198-4184-aba9-e9be2025f903-config-data\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.721930 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438d5704-c198-4184-aba9-e9be2025f903-log-httpd\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.721953 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438d5704-c198-4184-aba9-e9be2025f903-run-httpd\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: E1126 15:24:16.721993 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:16 crc kubenswrapper[4785]: E1126 15:24:16.722015 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz: configmap "swift-ring-files" not found Nov 26 15:24:16 crc kubenswrapper[4785]: E1126 15:24:16.722071 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift 
podName:438d5704-c198-4184-aba9-e9be2025f903 nodeName:}" failed. No retries permitted until 2025-11-26 15:24:17.222052013 +0000 UTC m=+1020.900417787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift") pod "swift-proxy-6bd58cfcf7-mxwdz" (UID: "438d5704-c198-4184-aba9-e9be2025f903") : configmap "swift-ring-files" not found Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.722417 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438d5704-c198-4184-aba9-e9be2025f903-run-httpd\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.722443 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/438d5704-c198-4184-aba9-e9be2025f903-log-httpd\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.740922 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/438d5704-c198-4184-aba9-e9be2025f903-config-data\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:16 crc kubenswrapper[4785]: I1126 15:24:16.751834 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcxmb\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-kube-api-access-mcxmb\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 
15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.229989 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:17 crc kubenswrapper[4785]: E1126 15:24:17.230198 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:17 crc kubenswrapper[4785]: E1126 15:24:17.230212 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz: configmap "swift-ring-files" not found Nov 26 15:24:17 crc kubenswrapper[4785]: E1126 15:24:17.230253 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift podName:438d5704-c198-4184-aba9-e9be2025f903 nodeName:}" failed. No retries permitted until 2025-11-26 15:24:18.230239357 +0000 UTC m=+1021.908605111 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift") pod "swift-proxy-6bd58cfcf7-mxwdz" (UID: "438d5704-c198-4184-aba9-e9be2025f903") : configmap "swift-ring-files" not found Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.736833 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:17 crc kubenswrapper[4785]: E1126 15:24:17.737040 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:17 crc kubenswrapper[4785]: E1126 15:24:17.737206 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:24:17 crc kubenswrapper[4785]: E1126 15:24:17.737261 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift podName:c22f4ea9-991d-4431-be3c-aeb8f547176e nodeName:}" failed. No retries permitted until 2025-11-26 15:24:21.737245568 +0000 UTC m=+1025.415611332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift") pod "swift-storage-0" (UID: "c22f4ea9-991d-4431-be3c-aeb8f547176e") : configmap "swift-ring-files" not found Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.867770 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-fsr6b"] Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.868776 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.871293 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.872001 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.883789 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-fsr6b"] Nov 26 15:24:17 crc kubenswrapper[4785]: I1126 15:24:17.986343 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-nhbrf"] Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.040935 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d518dbb-f95d-409e-be26-ec87f47d465a-etc-swift\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.040996 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-dispersionconf\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.041047 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/5d518dbb-f95d-409e-be26-ec87f47d465a-kube-api-access-fv4cr\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.041088 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-swiftconf\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.041137 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-scripts\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.041305 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-ring-data-devices\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142305 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d518dbb-f95d-409e-be26-ec87f47d465a-etc-swift\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142419 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-dispersionconf\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142483 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/5d518dbb-f95d-409e-be26-ec87f47d465a-kube-api-access-fv4cr\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142539 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-swiftconf\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142713 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-scripts\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142811 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d518dbb-f95d-409e-be26-ec87f47d465a-etc-swift\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.142817 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-ring-data-devices\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" 
Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.143379 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-scripts\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.144620 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-ring-data-devices\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.149142 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-dispersionconf\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.159964 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-swiftconf\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.166979 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/5d518dbb-f95d-409e-be26-ec87f47d465a-kube-api-access-fv4cr\") pod \"swift-ring-rebalance-fsr6b\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.186375 4785 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-66h9w" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.194804 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.244022 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:18 crc kubenswrapper[4785]: E1126 15:24:18.244197 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:18 crc kubenswrapper[4785]: E1126 15:24:18.244430 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz: configmap "swift-ring-files" not found Nov 26 15:24:18 crc kubenswrapper[4785]: E1126 15:24:18.244578 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift podName:438d5704-c198-4184-aba9-e9be2025f903 nodeName:}" failed. No retries permitted until 2025-11-26 15:24:20.244540088 +0000 UTC m=+1023.922905852 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift") pod "swift-proxy-6bd58cfcf7-mxwdz" (UID: "438d5704-c198-4184-aba9-e9be2025f903") : configmap "swift-ring-files" not found Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.515065 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-nhbrf" event={"ID":"abab561d-b77b-4484-803a-e3b57ce4ec3a","Type":"ContainerStarted","Data":"552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227"} Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.530226 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-nhbrf" podStartSLOduration=2.898794623 podStartE2EDuration="4.530203064s" podCreationTimestamp="2025-11-26 15:24:14 +0000 UTC" firstStartedPulling="2025-11-26 15:24:15.707637728 +0000 UTC m=+1019.386003492" lastFinishedPulling="2025-11-26 15:24:17.339046169 +0000 UTC m=+1021.017411933" observedRunningTime="2025-11-26 15:24:18.529090324 +0000 UTC m=+1022.207456128" watchObservedRunningTime="2025-11-26 15:24:18.530203064 +0000 UTC m=+1022.208568868" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.602076 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-tfk4s"] Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.603437 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.638160 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-tfk4s"] Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.655238 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-fsr6b"] Nov 26 15:24:18 crc kubenswrapper[4785]: W1126 15:24:18.670176 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d518dbb_f95d_409e_be26_ec87f47d465a.slice/crio-14c536cda6a2559bd0a1f25346f5fedaae2d09d3fd8c2ff9f2512f5229ab7f8f WatchSource:0}: Error finding container 14c536cda6a2559bd0a1f25346f5fedaae2d09d3fd8c2ff9f2512f5229ab7f8f: Status 404 returned error can't find the container with id 14c536cda6a2559bd0a1f25346f5fedaae2d09d3fd8c2ff9f2512f5229ab7f8f Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.751761 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwpdv\" (UniqueName: \"kubernetes.io/projected/4652c290-44a9-4e40-b880-c73c6be91f2d-kube-api-access-mwpdv\") pod \"glance-operator-index-tfk4s\" (UID: \"4652c290-44a9-4e40-b880-c73c6be91f2d\") " pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.853602 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwpdv\" (UniqueName: \"kubernetes.io/projected/4652c290-44a9-4e40-b880-c73c6be91f2d-kube-api-access-mwpdv\") pod \"glance-operator-index-tfk4s\" (UID: \"4652c290-44a9-4e40-b880-c73c6be91f2d\") " pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.869774 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwpdv\" (UniqueName: 
\"kubernetes.io/projected/4652c290-44a9-4e40-b880-c73c6be91f2d-kube-api-access-mwpdv\") pod \"glance-operator-index-tfk4s\" (UID: \"4652c290-44a9-4e40-b880-c73c6be91f2d\") " pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:18 crc kubenswrapper[4785]: I1126 15:24:18.940582 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:19 crc kubenswrapper[4785]: I1126 15:24:19.428573 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-tfk4s"] Nov 26 15:24:19 crc kubenswrapper[4785]: I1126 15:24:19.522938 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-tfk4s" event={"ID":"4652c290-44a9-4e40-b880-c73c6be91f2d","Type":"ContainerStarted","Data":"13df964a8e01293f780d1848f5894553a350d2cee2df48295d508d3a4da18b95"} Nov 26 15:24:19 crc kubenswrapper[4785]: I1126 15:24:19.524225 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" event={"ID":"5d518dbb-f95d-409e-be26-ec87f47d465a","Type":"ContainerStarted","Data":"14c536cda6a2559bd0a1f25346f5fedaae2d09d3fd8c2ff9f2512f5229ab7f8f"} Nov 26 15:24:19 crc kubenswrapper[4785]: I1126 15:24:19.524318 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-nhbrf" podUID="abab561d-b77b-4484-803a-e3b57ce4ec3a" containerName="registry-server" containerID="cri-o://552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227" gracePeriod=2 Nov 26 15:24:19 crc kubenswrapper[4785]: I1126 15:24:19.928815 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.072145 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn9s4\" (UniqueName: \"kubernetes.io/projected/abab561d-b77b-4484-803a-e3b57ce4ec3a-kube-api-access-dn9s4\") pod \"abab561d-b77b-4484-803a-e3b57ce4ec3a\" (UID: \"abab561d-b77b-4484-803a-e3b57ce4ec3a\") " Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.083156 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abab561d-b77b-4484-803a-e3b57ce4ec3a-kube-api-access-dn9s4" (OuterVolumeSpecName: "kube-api-access-dn9s4") pod "abab561d-b77b-4484-803a-e3b57ce4ec3a" (UID: "abab561d-b77b-4484-803a-e3b57ce4ec3a"). InnerVolumeSpecName "kube-api-access-dn9s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.173667 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn9s4\" (UniqueName: \"kubernetes.io/projected/abab561d-b77b-4484-803a-e3b57ce4ec3a-kube-api-access-dn9s4\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.274400 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:20 crc kubenswrapper[4785]: E1126 15:24:20.274936 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:20 crc kubenswrapper[4785]: E1126 15:24:20.275081 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz: configmap "swift-ring-files" not found 
Nov 26 15:24:20 crc kubenswrapper[4785]: E1126 15:24:20.275440 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift podName:438d5704-c198-4184-aba9-e9be2025f903 nodeName:}" failed. No retries permitted until 2025-11-26 15:24:24.275423292 +0000 UTC m=+1027.953789056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift") pod "swift-proxy-6bd58cfcf7-mxwdz" (UID: "438d5704-c198-4184-aba9-e9be2025f903") : configmap "swift-ring-files" not found Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.532289 4785 generic.go:334] "Generic (PLEG): container finished" podID="abab561d-b77b-4484-803a-e3b57ce4ec3a" containerID="552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227" exitCode=0 Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.532355 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-nhbrf" Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.532360 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-nhbrf" event={"ID":"abab561d-b77b-4484-803a-e3b57ce4ec3a","Type":"ContainerDied","Data":"552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227"} Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.532507 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-nhbrf" event={"ID":"abab561d-b77b-4484-803a-e3b57ce4ec3a","Type":"ContainerDied","Data":"953cf4423bb63b1b6c36dbb3dc538cd6383883e4da6f9a5e4a178e55da14e586"} Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.532537 4785 scope.go:117] "RemoveContainer" containerID="552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227" Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.534328 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-tfk4s" event={"ID":"4652c290-44a9-4e40-b880-c73c6be91f2d","Type":"ContainerStarted","Data":"814bef214b5b8b373a2f67d795ef2fa6a399d67dd23384238a13a42f741b3066"} Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.559699 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-tfk4s" podStartSLOduration=2.26647063 podStartE2EDuration="2.55968516s" podCreationTimestamp="2025-11-26 15:24:18 +0000 UTC" firstStartedPulling="2025-11-26 15:24:19.437215153 +0000 UTC m=+1023.115580917" lastFinishedPulling="2025-11-26 15:24:19.730429673 +0000 UTC m=+1023.408795447" observedRunningTime="2025-11-26 15:24:20.557594744 +0000 UTC m=+1024.235960508" watchObservedRunningTime="2025-11-26 15:24:20.55968516 +0000 UTC m=+1024.238050924" Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.574871 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/glance-operator-index-nhbrf"] Nov 26 15:24:20 crc kubenswrapper[4785]: I1126 15:24:20.580342 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-nhbrf"] Nov 26 15:24:21 crc kubenswrapper[4785]: I1126 15:24:21.048283 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abab561d-b77b-4484-803a-e3b57ce4ec3a" path="/var/lib/kubelet/pods/abab561d-b77b-4484-803a-e3b57ce4ec3a/volumes" Nov 26 15:24:21 crc kubenswrapper[4785]: E1126 15:24:21.799955 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:21 crc kubenswrapper[4785]: E1126 15:24:21.799995 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 26 15:24:21 crc kubenswrapper[4785]: E1126 15:24:21.800054 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift podName:c22f4ea9-991d-4431-be3c-aeb8f547176e nodeName:}" failed. No retries permitted until 2025-11-26 15:24:29.800035165 +0000 UTC m=+1033.478400929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift") pod "swift-storage-0" (UID: "c22f4ea9-991d-4431-be3c-aeb8f547176e") : configmap "swift-ring-files" not found Nov 26 15:24:21 crc kubenswrapper[4785]: I1126 15:24:21.800570 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:24 crc kubenswrapper[4785]: I1126 15:24:24.336164 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:24 crc kubenswrapper[4785]: E1126 15:24:24.336378 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:24 crc kubenswrapper[4785]: E1126 15:24:24.336608 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz: configmap "swift-ring-files" not found Nov 26 15:24:24 crc kubenswrapper[4785]: E1126 15:24:24.336665 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift podName:438d5704-c198-4184-aba9-e9be2025f903 nodeName:}" failed. No retries permitted until 2025-11-26 15:24:32.336647707 +0000 UTC m=+1036.015013471 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift") pod "swift-proxy-6bd58cfcf7-mxwdz" (UID: "438d5704-c198-4184-aba9-e9be2025f903") : configmap "swift-ring-files" not found Nov 26 15:24:24 crc kubenswrapper[4785]: I1126 15:24:24.874180 4785 scope.go:117] "RemoveContainer" containerID="552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227" Nov 26 15:24:24 crc kubenswrapper[4785]: E1126 15:24:24.875368 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227\": container with ID starting with 552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227 not found: ID does not exist" containerID="552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227" Nov 26 15:24:24 crc kubenswrapper[4785]: I1126 15:24:24.875410 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227"} err="failed to get container status \"552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227\": rpc error: code = NotFound desc = could not find container \"552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227\": container with ID starting with 552ced87fbbf3a847a6472468beb5bc8b13a52c4d125a4f5647ca82f7b6be227 not found: ID does not exist" Nov 26 15:24:25 crc kubenswrapper[4785]: I1126 15:24:25.579717 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" event={"ID":"5d518dbb-f95d-409e-be26-ec87f47d465a","Type":"ContainerStarted","Data":"c57f27fbd169ff6e81903892472a0f24f6db14195d77780bffa15030069e24d8"} Nov 26 15:24:25 crc kubenswrapper[4785]: I1126 15:24:25.597190 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" podStartSLOduration=2.312955844 podStartE2EDuration="8.597174188s" podCreationTimestamp="2025-11-26 15:24:17 +0000 UTC" firstStartedPulling="2025-11-26 15:24:18.670448667 +0000 UTC m=+1022.348814451" lastFinishedPulling="2025-11-26 15:24:24.954667031 +0000 UTC m=+1028.633032795" observedRunningTime="2025-11-26 15:24:25.595960315 +0000 UTC m=+1029.274326089" watchObservedRunningTime="2025-11-26 15:24:25.597174188 +0000 UTC m=+1029.275539952" Nov 26 15:24:28 crc kubenswrapper[4785]: I1126 15:24:28.941436 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:28 crc kubenswrapper[4785]: I1126 15:24:28.941847 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:28 crc kubenswrapper[4785]: I1126 15:24:28.978510 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:29 crc kubenswrapper[4785]: I1126 15:24:29.672363 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-tfk4s" Nov 26 15:24:29 crc kubenswrapper[4785]: I1126 15:24:29.816992 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:29 crc kubenswrapper[4785]: E1126 15:24:29.817168 4785 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 26 15:24:29 crc kubenswrapper[4785]: E1126 15:24:29.817404 4785 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap 
"swift-ring-files" not found Nov 26 15:24:29 crc kubenswrapper[4785]: E1126 15:24:29.817453 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift podName:c22f4ea9-991d-4431-be3c-aeb8f547176e nodeName:}" failed. No retries permitted until 2025-11-26 15:24:45.817438362 +0000 UTC m=+1049.495804126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift") pod "swift-storage-0" (UID: "c22f4ea9-991d-4431-be3c-aeb8f547176e") : configmap "swift-ring-files" not found Nov 26 15:24:32 crc kubenswrapper[4785]: I1126 15:24:32.386950 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:32 crc kubenswrapper[4785]: I1126 15:24:32.398487 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/438d5704-c198-4184-aba9-e9be2025f903-etc-swift\") pod \"swift-proxy-6bd58cfcf7-mxwdz\" (UID: \"438d5704-c198-4184-aba9-e9be2025f903\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:32 crc kubenswrapper[4785]: I1126 15:24:32.636257 4785 generic.go:334] "Generic (PLEG): container finished" podID="5d518dbb-f95d-409e-be26-ec87f47d465a" containerID="c57f27fbd169ff6e81903892472a0f24f6db14195d77780bffa15030069e24d8" exitCode=0 Nov 26 15:24:32 crc kubenswrapper[4785]: I1126 15:24:32.636303 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" event={"ID":"5d518dbb-f95d-409e-be26-ec87f47d465a","Type":"ContainerDied","Data":"c57f27fbd169ff6e81903892472a0f24f6db14195d77780bffa15030069e24d8"} 
Nov 26 15:24:32 crc kubenswrapper[4785]: I1126 15:24:32.686876 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:33 crc kubenswrapper[4785]: I1126 15:24:33.123829 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz"] Nov 26 15:24:33 crc kubenswrapper[4785]: I1126 15:24:33.646478 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" event={"ID":"438d5704-c198-4184-aba9-e9be2025f903","Type":"ContainerStarted","Data":"c82348abeb07174d370045b018236ae44cc10b6d976e7666b1a18c38f1de8b89"} Nov 26 15:24:33 crc kubenswrapper[4785]: I1126 15:24:33.647361 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" event={"ID":"438d5704-c198-4184-aba9-e9be2025f903","Type":"ContainerStarted","Data":"9fbfaa86194af0a0cc5523362ba91af5a5b065c0cc871cb32e1147e370f5e6d9"} Nov 26 15:24:33 crc kubenswrapper[4785]: I1126 15:24:33.929190 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.014534 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-ring-data-devices\") pod \"5d518dbb-f95d-409e-be26-ec87f47d465a\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.015412 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5d518dbb-f95d-409e-be26-ec87f47d465a" (UID: "5d518dbb-f95d-409e-be26-ec87f47d465a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.015938 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-dispersionconf\") pod \"5d518dbb-f95d-409e-be26-ec87f47d465a\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.015985 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/5d518dbb-f95d-409e-be26-ec87f47d465a-kube-api-access-fv4cr\") pod \"5d518dbb-f95d-409e-be26-ec87f47d465a\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.016034 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-scripts\") pod \"5d518dbb-f95d-409e-be26-ec87f47d465a\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.016084 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-swiftconf\") pod \"5d518dbb-f95d-409e-be26-ec87f47d465a\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.016130 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d518dbb-f95d-409e-be26-ec87f47d465a-etc-swift\") pod \"5d518dbb-f95d-409e-be26-ec87f47d465a\" (UID: \"5d518dbb-f95d-409e-be26-ec87f47d465a\") " Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.016972 4785 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.017032 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d518dbb-f95d-409e-be26-ec87f47d465a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5d518dbb-f95d-409e-be26-ec87f47d465a" (UID: "5d518dbb-f95d-409e-be26-ec87f47d465a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.025257 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d518dbb-f95d-409e-be26-ec87f47d465a-kube-api-access-fv4cr" (OuterVolumeSpecName: "kube-api-access-fv4cr") pod "5d518dbb-f95d-409e-be26-ec87f47d465a" (UID: "5d518dbb-f95d-409e-be26-ec87f47d465a"). InnerVolumeSpecName "kube-api-access-fv4cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.035138 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5d518dbb-f95d-409e-be26-ec87f47d465a" (UID: "5d518dbb-f95d-409e-be26-ec87f47d465a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.035473 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-scripts" (OuterVolumeSpecName: "scripts") pod "5d518dbb-f95d-409e-be26-ec87f47d465a" (UID: "5d518dbb-f95d-409e-be26-ec87f47d465a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.035722 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5d518dbb-f95d-409e-be26-ec87f47d465a" (UID: "5d518dbb-f95d-409e-be26-ec87f47d465a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.118752 4785 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.118789 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/5d518dbb-f95d-409e-be26-ec87f47d465a-kube-api-access-fv4cr\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.118801 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d518dbb-f95d-409e-be26-ec87f47d465a-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.118810 4785 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5d518dbb-f95d-409e-be26-ec87f47d465a-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.118819 4785 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5d518dbb-f95d-409e-be26-ec87f47d465a-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.656600 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" 
event={"ID":"438d5704-c198-4184-aba9-e9be2025f903","Type":"ContainerStarted","Data":"3ef3c3055c11d7586cb77dfb1bd470b25d72a7fa17662adb62760fea7660e542"} Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.656813 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.657686 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" event={"ID":"5d518dbb-f95d-409e-be26-ec87f47d465a","Type":"ContainerDied","Data":"14c536cda6a2559bd0a1f25346f5fedaae2d09d3fd8c2ff9f2512f5229ab7f8f"} Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.657711 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14c536cda6a2559bd0a1f25346f5fedaae2d09d3fd8c2ff9f2512f5229ab7f8f" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.657740 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-fsr6b" Nov 26 15:24:34 crc kubenswrapper[4785]: I1126 15:24:34.701009 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" podStartSLOduration=18.700982776 podStartE2EDuration="18.700982776s" podCreationTimestamp="2025-11-26 15:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:24:34.682776213 +0000 UTC m=+1038.361142007" watchObservedRunningTime="2025-11-26 15:24:34.700982776 +0000 UTC m=+1038.379348580" Nov 26 15:24:35 crc kubenswrapper[4785]: I1126 15:24:35.664403 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:42 crc kubenswrapper[4785]: I1126 15:24:42.692405 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:42 crc kubenswrapper[4785]: I1126 15:24:42.693033 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-mxwdz" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.842286 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28"] Nov 26 15:24:44 crc kubenswrapper[4785]: E1126 15:24:44.842973 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abab561d-b77b-4484-803a-e3b57ce4ec3a" containerName="registry-server" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.842985 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="abab561d-b77b-4484-803a-e3b57ce4ec3a" containerName="registry-server" Nov 26 15:24:44 crc kubenswrapper[4785]: E1126 15:24:44.843009 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d518dbb-f95d-409e-be26-ec87f47d465a" containerName="swift-ring-rebalance" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.843015 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d518dbb-f95d-409e-be26-ec87f47d465a" containerName="swift-ring-rebalance" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.843139 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="abab561d-b77b-4484-803a-e3b57ce4ec3a" containerName="registry-server" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.843149 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d518dbb-f95d-409e-be26-ec87f47d465a" containerName="swift-ring-rebalance" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.844051 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.847306 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sfhll" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.852575 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28"] Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.895541 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-bundle\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.895687 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmnl\" (UniqueName: \"kubernetes.io/projected/0da8226f-52cd-44a6-9fc6-b30c8a92c074-kube-api-access-tvmnl\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.895722 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-util\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 
15:24:44.996597 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmnl\" (UniqueName: \"kubernetes.io/projected/0da8226f-52cd-44a6-9fc6-b30c8a92c074-kube-api-access-tvmnl\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.996695 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-util\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.996753 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-bundle\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.997170 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-util\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:44 crc kubenswrapper[4785]: I1126 15:24:44.997242 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-bundle\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:45 crc kubenswrapper[4785]: I1126 15:24:45.016620 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmnl\" (UniqueName: \"kubernetes.io/projected/0da8226f-52cd-44a6-9fc6-b30c8a92c074-kube-api-access-tvmnl\") pod \"3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:45 crc kubenswrapper[4785]: I1126 15:24:45.163053 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:45 crc kubenswrapper[4785]: I1126 15:24:45.590884 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28"] Nov 26 15:24:45 crc kubenswrapper[4785]: I1126 15:24:45.750708 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" event={"ID":"0da8226f-52cd-44a6-9fc6-b30c8a92c074","Type":"ContainerStarted","Data":"ba17f09758912a7859d8ece4f3c524fbf0730bf7a5d4ffede95298b4e425dd5b"} Nov 26 15:24:45 crc kubenswrapper[4785]: I1126 15:24:45.909107 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:45 crc kubenswrapper[4785]: I1126 15:24:45.916648 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c22f4ea9-991d-4431-be3c-aeb8f547176e-etc-swift\") pod \"swift-storage-0\" (UID: \"c22f4ea9-991d-4431-be3c-aeb8f547176e\") " pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:46 crc kubenswrapper[4785]: I1126 15:24:46.106932 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Nov 26 15:24:46 crc kubenswrapper[4785]: I1126 15:24:46.559169 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 26 15:24:46 crc kubenswrapper[4785]: W1126 15:24:46.562959 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc22f4ea9_991d_4431_be3c_aeb8f547176e.slice/crio-e1211f0eb049450a4dc1d8f69a02ea666c5cae3d8e1d06ca3df931ae560f77d5 WatchSource:0}: Error finding container e1211f0eb049450a4dc1d8f69a02ea666c5cae3d8e1d06ca3df931ae560f77d5: Status 404 returned error can't find the container with id e1211f0eb049450a4dc1d8f69a02ea666c5cae3d8e1d06ca3df931ae560f77d5 Nov 26 15:24:46 crc kubenswrapper[4785]: I1126 15:24:46.765993 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"e1211f0eb049450a4dc1d8f69a02ea666c5cae3d8e1d06ca3df931ae560f77d5"} Nov 26 15:24:46 crc kubenswrapper[4785]: I1126 15:24:46.767506 4785 generic.go:334] "Generic (PLEG): container finished" podID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerID="a7b0a36131d6a0d3a345f71127037f728dd160eef2efc5ca8275aa7798ac8a4d" exitCode=0 Nov 26 15:24:46 crc kubenswrapper[4785]: I1126 15:24:46.767571 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" 
event={"ID":"0da8226f-52cd-44a6-9fc6-b30c8a92c074","Type":"ContainerDied","Data":"a7b0a36131d6a0d3a345f71127037f728dd160eef2efc5ca8275aa7798ac8a4d"} Nov 26 15:24:47 crc kubenswrapper[4785]: I1126 15:24:47.776274 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"a878b25e0d160b6302b6389b1270b5a14be38842b44e597dc37d0d0a321bec9c"} Nov 26 15:24:47 crc kubenswrapper[4785]: I1126 15:24:47.779016 4785 generic.go:334] "Generic (PLEG): container finished" podID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerID="566511f8d25f95a254e693362987453ad0d1b90ea36b57572149c51dfe4e16ce" exitCode=0 Nov 26 15:24:47 crc kubenswrapper[4785]: I1126 15:24:47.779078 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" event={"ID":"0da8226f-52cd-44a6-9fc6-b30c8a92c074","Type":"ContainerDied","Data":"566511f8d25f95a254e693362987453ad0d1b90ea36b57572149c51dfe4e16ce"} Nov 26 15:24:48 crc kubenswrapper[4785]: I1126 15:24:48.788263 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"f22a3f8ca10317f2273419fc27292c75ba8290df60857522f4d9f2b648855c35"} Nov 26 15:24:48 crc kubenswrapper[4785]: I1126 15:24:48.788830 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"5c8bb9fbaf333716bb4f23a54b7c8a55c0342870c41986d27cf6d22685130b18"} Nov 26 15:24:48 crc kubenswrapper[4785]: I1126 15:24:48.788907 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"0866801f3787b0ae7bd448e7cac484d2a63824acb14f3e2efb743e4b3acb413f"} Nov 26 15:24:48 crc kubenswrapper[4785]: I1126 15:24:48.790588 4785 generic.go:334] "Generic (PLEG): container finished" podID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerID="e46a9aeac3b31bc772ad8e33eb5aaad1d582e5900ead18c15807825b065fc20b" exitCode=0 Nov 26 15:24:48 crc kubenswrapper[4785]: I1126 15:24:48.790638 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" event={"ID":"0da8226f-52cd-44a6-9fc6-b30c8a92c074","Type":"ContainerDied","Data":"e46a9aeac3b31bc772ad8e33eb5aaad1d582e5900ead18c15807825b065fc20b"} Nov 26 15:24:49 crc kubenswrapper[4785]: I1126 15:24:49.801497 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"ccb3fea290069405bea7d96a521a49c720705a173dd69358ede2a92678fc49bd"} Nov 26 15:24:49 crc kubenswrapper[4785]: I1126 15:24:49.802143 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"97aaa2ee9135176c62a21b33c9c69803a94444b15a82b538fc3840d5be85481a"} Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.076856 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.182590 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-bundle\") pod \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.182707 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-util\") pod \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.183663 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmnl\" (UniqueName: \"kubernetes.io/projected/0da8226f-52cd-44a6-9fc6-b30c8a92c074-kube-api-access-tvmnl\") pod \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\" (UID: \"0da8226f-52cd-44a6-9fc6-b30c8a92c074\") " Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.183744 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-bundle" (OuterVolumeSpecName: "bundle") pod "0da8226f-52cd-44a6-9fc6-b30c8a92c074" (UID: "0da8226f-52cd-44a6-9fc6-b30c8a92c074"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.184192 4785 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.189547 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da8226f-52cd-44a6-9fc6-b30c8a92c074-kube-api-access-tvmnl" (OuterVolumeSpecName: "kube-api-access-tvmnl") pod "0da8226f-52cd-44a6-9fc6-b30c8a92c074" (UID: "0da8226f-52cd-44a6-9fc6-b30c8a92c074"). InnerVolumeSpecName "kube-api-access-tvmnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.201415 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-util" (OuterVolumeSpecName: "util") pod "0da8226f-52cd-44a6-9fc6-b30c8a92c074" (UID: "0da8226f-52cd-44a6-9fc6-b30c8a92c074"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.286372 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmnl\" (UniqueName: \"kubernetes.io/projected/0da8226f-52cd-44a6-9fc6-b30c8a92c074-kube-api-access-tvmnl\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.286413 4785 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da8226f-52cd-44a6-9fc6-b30c8a92c074-util\") on node \"crc\" DevicePath \"\"" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.810998 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"2772c71cbdc35998c676d9f0f13cf91646486a3794e8d839c1e09d837530b4d3"} Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.811033 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"ebc8c5022b14b29a1b8443dd7abb81ce99328e8aaa8421a6224a05c0dfc44243"} Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.813426 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" event={"ID":"0da8226f-52cd-44a6-9fc6-b30c8a92c074","Type":"ContainerDied","Data":"ba17f09758912a7859d8ece4f3c524fbf0730bf7a5d4ffede95298b4e425dd5b"} Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.813453 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba17f09758912a7859d8ece4f3c524fbf0730bf7a5d4ffede95298b4e425dd5b" Nov 26 15:24:50 crc kubenswrapper[4785]: I1126 15:24:50.813488 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28" Nov 26 15:24:50 crc kubenswrapper[4785]: E1126 15:24:50.914832 4785 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0da8226f_52cd_44a6_9fc6_b30c8a92c074.slice/crio-ba17f09758912a7859d8ece4f3c524fbf0730bf7a5d4ffede95298b4e425dd5b\": RecentStats: unable to find data in memory cache]" Nov 26 15:24:51 crc kubenswrapper[4785]: I1126 15:24:51.826313 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"2e8c281c8a72b93494ce0075836f0b78872e6d9ba154b8e90a8a12539aa34d30"} Nov 26 15:24:51 crc kubenswrapper[4785]: I1126 15:24:51.826365 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"8d319a21c29557607b94a3f30a84a2e76789694ea73a0952bcd9c12de7e24577"} Nov 26 15:24:52 crc kubenswrapper[4785]: I1126 15:24:52.842344 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"9839b91829503260d1e58c53256132bed03a2ba0feb2429e06760c1de5137224"} Nov 26 15:24:52 crc kubenswrapper[4785]: I1126 15:24:52.842402 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"4e87004aeb7fae6f5dd70c8e97ecd9494593c5e324812b67e7b52ff2e0c3b130"} Nov 26 15:24:52 crc kubenswrapper[4785]: I1126 15:24:52.842421 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"46413aabc8a4ebb3cf8e05b58d658807b53c762c00935d0fdc72eb4515fdc776"} Nov 26 15:24:52 crc kubenswrapper[4785]: I1126 15:24:52.842439 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"2a596c3526c347b40f7a0fd8068899d735b1b381293789df1c672d7bcf698d4d"} Nov 26 15:24:52 crc kubenswrapper[4785]: I1126 15:24:52.842457 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"c22f4ea9-991d-4431-be3c-aeb8f547176e","Type":"ContainerStarted","Data":"29d64faf9ac8c817378ec593b82deeb2a4e5955210f3276b3ccfff0623b0e227"} Nov 26 15:24:52 crc kubenswrapper[4785]: I1126 15:24:52.886940 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=36.242700774 podStartE2EDuration="40.886903348s" podCreationTimestamp="2025-11-26 15:24:12 +0000 UTC" firstStartedPulling="2025-11-26 15:24:46.565224613 +0000 UTC m=+1050.243590367" lastFinishedPulling="2025-11-26 15:24:51.209427177 +0000 UTC m=+1054.887792941" observedRunningTime="2025-11-26 15:24:52.876779122 +0000 UTC m=+1056.555144926" watchObservedRunningTime="2025-11-26 15:24:52.886903348 +0000 UTC m=+1056.565269182" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.813094 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5"] Nov 26 15:25:00 crc kubenswrapper[4785]: E1126 15:25:00.813990 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="extract" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.814199 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="extract" Nov 26 15:25:00 crc 
kubenswrapper[4785]: E1126 15:25:00.814219 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="pull" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.814227 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="pull" Nov 26 15:25:00 crc kubenswrapper[4785]: E1126 15:25:00.814241 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="util" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.814250 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="util" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.814403 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da8226f-52cd-44a6-9fc6-b30c8a92c074" containerName="extract" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.814966 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.816971 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mjzws" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.817472 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.828221 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5"] Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.945703 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlhfz\" (UniqueName: \"kubernetes.io/projected/62cac43a-a147-46b5-bbd6-4b452a008291-kube-api-access-wlhfz\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.945763 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/62cac43a-a147-46b5-bbd6-4b452a008291-apiservice-cert\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:00 crc kubenswrapper[4785]: I1126 15:25:00.945836 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/62cac43a-a147-46b5-bbd6-4b452a008291-webhook-cert\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: 
\"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.053985 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/62cac43a-a147-46b5-bbd6-4b452a008291-webhook-cert\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.054097 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlhfz\" (UniqueName: \"kubernetes.io/projected/62cac43a-a147-46b5-bbd6-4b452a008291-kube-api-access-wlhfz\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.054145 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/62cac43a-a147-46b5-bbd6-4b452a008291-apiservice-cert\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.061789 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/62cac43a-a147-46b5-bbd6-4b452a008291-apiservice-cert\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.072639 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/62cac43a-a147-46b5-bbd6-4b452a008291-webhook-cert\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.077989 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlhfz\" (UniqueName: \"kubernetes.io/projected/62cac43a-a147-46b5-bbd6-4b452a008291-kube-api-access-wlhfz\") pod \"glance-operator-controller-manager-56ccd5f88c-dzft5\" (UID: \"62cac43a-a147-46b5-bbd6-4b452a008291\") " pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.144208 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.568293 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5"] Nov 26 15:25:01 crc kubenswrapper[4785]: W1126 15:25:01.577851 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62cac43a_a147_46b5_bbd6_4b452a008291.slice/crio-c77d8fd08a6dfebaf2041fb20ef124cf985b1d9563c45271f0324817a77f3e0d WatchSource:0}: Error finding container c77d8fd08a6dfebaf2041fb20ef124cf985b1d9563c45271f0324817a77f3e0d: Status 404 returned error can't find the container with id c77d8fd08a6dfebaf2041fb20ef124cf985b1d9563c45271f0324817a77f3e0d Nov 26 15:25:01 crc kubenswrapper[4785]: I1126 15:25:01.910408 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" 
event={"ID":"62cac43a-a147-46b5-bbd6-4b452a008291","Type":"ContainerStarted","Data":"c77d8fd08a6dfebaf2041fb20ef124cf985b1d9563c45271f0324817a77f3e0d"} Nov 26 15:25:03 crc kubenswrapper[4785]: I1126 15:25:03.927266 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" event={"ID":"62cac43a-a147-46b5-bbd6-4b452a008291","Type":"ContainerStarted","Data":"76f39fa4101934ea2d6699cd939e3a8b8fd3a504908f7a2fc04158b6f504764c"} Nov 26 15:25:03 crc kubenswrapper[4785]: I1126 15:25:03.927687 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:03 crc kubenswrapper[4785]: I1126 15:25:03.946706 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" podStartSLOduration=2.44397174 podStartE2EDuration="3.946688313s" podCreationTimestamp="2025-11-26 15:25:00 +0000 UTC" firstStartedPulling="2025-11-26 15:25:01.580140302 +0000 UTC m=+1065.258506076" lastFinishedPulling="2025-11-26 15:25:03.082856885 +0000 UTC m=+1066.761222649" observedRunningTime="2025-11-26 15:25:03.944395711 +0000 UTC m=+1067.622761495" watchObservedRunningTime="2025-11-26 15:25:03.946688313 +0000 UTC m=+1067.625054067" Nov 26 15:25:11 crc kubenswrapper[4785]: I1126 15:25:11.150397 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:25:13 crc kubenswrapper[4785]: I1126 15:25:13.922786 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-dz8z5"] Nov 26 15:25:13 crc kubenswrapper[4785]: I1126 15:25:13.925591 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:13 crc kubenswrapper[4785]: I1126 15:25:13.928914 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-dz8z5"] Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.027006 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.027859 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.030229 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.030353 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-sflsz" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.032486 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.033687 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.039328 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.067891 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61469b1b-225b-4ccb-b192-167ce458296d-operator-scripts\") pod \"glance-db-create-dz8z5\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.067995 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hq4h\" (UniqueName: \"kubernetes.io/projected/61469b1b-225b-4ccb-b192-167ce458296d-kube-api-access-4hq4h\") pod \"glance-db-create-dz8z5\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.124491 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-f202-account-create-update-ntk2j"] Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.126064 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.130469 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f202-account-create-update-ntk2j"] Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.132380 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169053 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61469b1b-225b-4ccb-b192-167ce458296d-operator-scripts\") pod \"glance-db-create-dz8z5\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169357 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169518 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-scripts\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169651 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjmf\" (UniqueName: \"kubernetes.io/projected/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-kube-api-access-khjmf\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169800 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169930 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hq4h\" (UniqueName: \"kubernetes.io/projected/61469b1b-225b-4ccb-b192-167ce458296d-kube-api-access-4hq4h\") pod \"glance-db-create-dz8z5\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.169809 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61469b1b-225b-4ccb-b192-167ce458296d-operator-scripts\") pod \"glance-db-create-dz8z5\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.189623 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4hq4h\" (UniqueName: \"kubernetes.io/projected/61469b1b-225b-4ccb-b192-167ce458296d-kube-api-access-4hq4h\") pod \"glance-db-create-dz8z5\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.248869 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.272525 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.272616 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0484b3-bba6-4484-a160-ea8d7ce284d7-operator-scripts\") pod \"glance-f202-account-create-update-ntk2j\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.272667 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-scripts\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.272687 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjmf\" (UniqueName: \"kubernetes.io/projected/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-kube-api-access-khjmf\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " 
pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.272727 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.272808 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmt5h\" (UniqueName: \"kubernetes.io/projected/1c0484b3-bba6-4484-a160-ea8d7ce284d7-kube-api-access-gmt5h\") pod \"glance-f202-account-create-update-ntk2j\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.273999 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.274257 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-scripts\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.277039 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" 
Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.290167 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjmf\" (UniqueName: \"kubernetes.io/projected/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-kube-api-access-khjmf\") pod \"openstackclient\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.344877 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.374431 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmt5h\" (UniqueName: \"kubernetes.io/projected/1c0484b3-bba6-4484-a160-ea8d7ce284d7-kube-api-access-gmt5h\") pod \"glance-f202-account-create-update-ntk2j\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.374515 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0484b3-bba6-4484-a160-ea8d7ce284d7-operator-scripts\") pod \"glance-f202-account-create-update-ntk2j\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.388150 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0484b3-bba6-4484-a160-ea8d7ce284d7-operator-scripts\") pod \"glance-f202-account-create-update-ntk2j\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.407434 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmt5h\" 
(UniqueName: \"kubernetes.io/projected/1c0484b3-bba6-4484-a160-ea8d7ce284d7-kube-api-access-gmt5h\") pod \"glance-f202-account-create-update-ntk2j\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.469182 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.674169 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-dz8z5"] Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.755165 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:25:14 crc kubenswrapper[4785]: W1126 15:25:14.768112 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57bb65c9_f6e0_4ee2_ac07_5e1e6d2f70eb.slice/crio-31dc47a1decc816f7599e2b4a6f82f7731c1cd2a4c724dc73569b1970eee545e WatchSource:0}: Error finding container 31dc47a1decc816f7599e2b4a6f82f7731c1cd2a4c724dc73569b1970eee545e: Status 404 returned error can't find the container with id 31dc47a1decc816f7599e2b4a6f82f7731c1cd2a4c724dc73569b1970eee545e Nov 26 15:25:14 crc kubenswrapper[4785]: I1126 15:25:14.908489 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f202-account-create-update-ntk2j"] Nov 26 15:25:14 crc kubenswrapper[4785]: W1126 15:25:14.912776 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c0484b3_bba6_4484_a160_ea8d7ce284d7.slice/crio-67cf84dd9330a13068e27ee8f6460739dcc9356fe5813e52eb68c19f44e18ab2 WatchSource:0}: Error finding container 67cf84dd9330a13068e27ee8f6460739dcc9356fe5813e52eb68c19f44e18ab2: Status 404 returned error can't find the container with id 
67cf84dd9330a13068e27ee8f6460739dcc9356fe5813e52eb68c19f44e18ab2 Nov 26 15:25:15 crc kubenswrapper[4785]: I1126 15:25:15.003422 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb","Type":"ContainerStarted","Data":"31dc47a1decc816f7599e2b4a6f82f7731c1cd2a4c724dc73569b1970eee545e"} Nov 26 15:25:15 crc kubenswrapper[4785]: I1126 15:25:15.005938 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" event={"ID":"1c0484b3-bba6-4484-a160-ea8d7ce284d7","Type":"ContainerStarted","Data":"67cf84dd9330a13068e27ee8f6460739dcc9356fe5813e52eb68c19f44e18ab2"} Nov 26 15:25:15 crc kubenswrapper[4785]: I1126 15:25:15.007713 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dz8z5" event={"ID":"61469b1b-225b-4ccb-b192-167ce458296d","Type":"ContainerStarted","Data":"74ec31c8a8a5f6a8a079033bb0b6dc933a2ee0a52ccbde9648717cfa0cc00d91"} Nov 26 15:25:15 crc kubenswrapper[4785]: I1126 15:25:15.007738 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dz8z5" event={"ID":"61469b1b-225b-4ccb-b192-167ce458296d","Type":"ContainerStarted","Data":"8d3edbf095b94f9d498bf2886080ef7982d39a0d8a5e066a3f2b86ee4ee00e43"} Nov 26 15:25:15 crc kubenswrapper[4785]: I1126 15:25:15.021802 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-dz8z5" podStartSLOduration=2.02178288 podStartE2EDuration="2.02178288s" podCreationTimestamp="2025-11-26 15:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:25:15.020457464 +0000 UTC m=+1078.698823258" watchObservedRunningTime="2025-11-26 15:25:15.02178288 +0000 UTC m=+1078.700148644" Nov 26 15:25:16 crc kubenswrapper[4785]: I1126 15:25:16.016037 4785 
generic.go:334] "Generic (PLEG): container finished" podID="61469b1b-225b-4ccb-b192-167ce458296d" containerID="74ec31c8a8a5f6a8a079033bb0b6dc933a2ee0a52ccbde9648717cfa0cc00d91" exitCode=0 Nov 26 15:25:16 crc kubenswrapper[4785]: I1126 15:25:16.016141 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dz8z5" event={"ID":"61469b1b-225b-4ccb-b192-167ce458296d","Type":"ContainerDied","Data":"74ec31c8a8a5f6a8a079033bb0b6dc933a2ee0a52ccbde9648717cfa0cc00d91"} Nov 26 15:25:16 crc kubenswrapper[4785]: I1126 15:25:16.018789 4785 generic.go:334] "Generic (PLEG): container finished" podID="1c0484b3-bba6-4484-a160-ea8d7ce284d7" containerID="46c065037f15eff56f988da4c26fe1e139a76145894ebc7129dcf286c98afb94" exitCode=0 Nov 26 15:25:16 crc kubenswrapper[4785]: I1126 15:25:16.018823 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" event={"ID":"1c0484b3-bba6-4484-a160-ea8d7ce284d7","Type":"ContainerDied","Data":"46c065037f15eff56f988da4c26fe1e139a76145894ebc7129dcf286c98afb94"} Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.398015 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.402078 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.519361 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmt5h\" (UniqueName: \"kubernetes.io/projected/1c0484b3-bba6-4484-a160-ea8d7ce284d7-kube-api-access-gmt5h\") pod \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.519408 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hq4h\" (UniqueName: \"kubernetes.io/projected/61469b1b-225b-4ccb-b192-167ce458296d-kube-api-access-4hq4h\") pod \"61469b1b-225b-4ccb-b192-167ce458296d\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.519466 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0484b3-bba6-4484-a160-ea8d7ce284d7-operator-scripts\") pod \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\" (UID: \"1c0484b3-bba6-4484-a160-ea8d7ce284d7\") " Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.519539 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61469b1b-225b-4ccb-b192-167ce458296d-operator-scripts\") pod \"61469b1b-225b-4ccb-b192-167ce458296d\" (UID: \"61469b1b-225b-4ccb-b192-167ce458296d\") " Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.520085 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61469b1b-225b-4ccb-b192-167ce458296d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61469b1b-225b-4ccb-b192-167ce458296d" (UID: "61469b1b-225b-4ccb-b192-167ce458296d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.520107 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0484b3-bba6-4484-a160-ea8d7ce284d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c0484b3-bba6-4484-a160-ea8d7ce284d7" (UID: "1c0484b3-bba6-4484-a160-ea8d7ce284d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.530182 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61469b1b-225b-4ccb-b192-167ce458296d-kube-api-access-4hq4h" (OuterVolumeSpecName: "kube-api-access-4hq4h") pod "61469b1b-225b-4ccb-b192-167ce458296d" (UID: "61469b1b-225b-4ccb-b192-167ce458296d"). InnerVolumeSpecName "kube-api-access-4hq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.530391 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0484b3-bba6-4484-a160-ea8d7ce284d7-kube-api-access-gmt5h" (OuterVolumeSpecName: "kube-api-access-gmt5h") pod "1c0484b3-bba6-4484-a160-ea8d7ce284d7" (UID: "1c0484b3-bba6-4484-a160-ea8d7ce284d7"). InnerVolumeSpecName "kube-api-access-gmt5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.620956 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61469b1b-225b-4ccb-b192-167ce458296d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.621287 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmt5h\" (UniqueName: \"kubernetes.io/projected/1c0484b3-bba6-4484-a160-ea8d7ce284d7-kube-api-access-gmt5h\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.621299 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hq4h\" (UniqueName: \"kubernetes.io/projected/61469b1b-225b-4ccb-b192-167ce458296d-kube-api-access-4hq4h\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:17 crc kubenswrapper[4785]: I1126 15:25:17.621308 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0484b3-bba6-4484-a160-ea8d7ce284d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:18 crc kubenswrapper[4785]: I1126 15:25:18.035288 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" Nov 26 15:25:18 crc kubenswrapper[4785]: I1126 15:25:18.035295 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f202-account-create-update-ntk2j" event={"ID":"1c0484b3-bba6-4484-a160-ea8d7ce284d7","Type":"ContainerDied","Data":"67cf84dd9330a13068e27ee8f6460739dcc9356fe5813e52eb68c19f44e18ab2"} Nov 26 15:25:18 crc kubenswrapper[4785]: I1126 15:25:18.035343 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67cf84dd9330a13068e27ee8f6460739dcc9356fe5813e52eb68c19f44e18ab2" Nov 26 15:25:18 crc kubenswrapper[4785]: I1126 15:25:18.037626 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dz8z5" event={"ID":"61469b1b-225b-4ccb-b192-167ce458296d","Type":"ContainerDied","Data":"8d3edbf095b94f9d498bf2886080ef7982d39a0d8a5e066a3f2b86ee4ee00e43"} Nov 26 15:25:18 crc kubenswrapper[4785]: I1126 15:25:18.037656 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d3edbf095b94f9d498bf2886080ef7982d39a0d8a5e066a3f2b86ee4ee00e43" Nov 26 15:25:18 crc kubenswrapper[4785]: I1126 15:25:18.037744 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dz8z5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.260323 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-jg5x5"] Nov 26 15:25:19 crc kubenswrapper[4785]: E1126 15:25:19.260673 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61469b1b-225b-4ccb-b192-167ce458296d" containerName="mariadb-database-create" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.260689 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="61469b1b-225b-4ccb-b192-167ce458296d" containerName="mariadb-database-create" Nov 26 15:25:19 crc kubenswrapper[4785]: E1126 15:25:19.260702 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0484b3-bba6-4484-a160-ea8d7ce284d7" containerName="mariadb-account-create-update" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.260710 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0484b3-bba6-4484-a160-ea8d7ce284d7" containerName="mariadb-account-create-update" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.260879 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0484b3-bba6-4484-a160-ea8d7ce284d7" containerName="mariadb-account-create-update" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.260917 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="61469b1b-225b-4ccb-b192-167ce458296d" containerName="mariadb-database-create" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.261453 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.264182 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.264375 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-krbzt" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.275375 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-jg5x5"] Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.352772 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhc4g\" (UniqueName: \"kubernetes.io/projected/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-kube-api-access-dhc4g\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.352888 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-db-sync-config-data\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.352925 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-config-data\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.454276 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhc4g\" (UniqueName: 
\"kubernetes.io/projected/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-kube-api-access-dhc4g\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.454376 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-db-sync-config-data\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.454409 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-config-data\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.461571 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-config-data\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.472586 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-db-sync-config-data\") pod \"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.506212 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhc4g\" (UniqueName: \"kubernetes.io/projected/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-kube-api-access-dhc4g\") pod 
\"glance-db-sync-jg5x5\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:19 crc kubenswrapper[4785]: I1126 15:25:19.580476 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:24 crc kubenswrapper[4785]: I1126 15:25:24.550385 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-jg5x5"] Nov 26 15:25:24 crc kubenswrapper[4785]: W1126 15:25:24.570230 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ae3bfd_bcde_4b41_b6ea_32789198b54f.slice/crio-b72a8b0e5b113728a18742c67069ee1c753112c839dc29520ad409e7d93c970e WatchSource:0}: Error finding container b72a8b0e5b113728a18742c67069ee1c753112c839dc29520ad409e7d93c970e: Status 404 returned error can't find the container with id b72a8b0e5b113728a18742c67069ee1c753112c839dc29520ad409e7d93c970e Nov 26 15:25:25 crc kubenswrapper[4785]: I1126 15:25:25.158435 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb","Type":"ContainerStarted","Data":"dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6"} Nov 26 15:25:25 crc kubenswrapper[4785]: I1126 15:25:25.160686 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-jg5x5" event={"ID":"b8ae3bfd-bcde-4b41-b6ea-32789198b54f","Type":"ContainerStarted","Data":"b72a8b0e5b113728a18742c67069ee1c753112c839dc29520ad409e7d93c970e"} Nov 26 15:25:27 crc kubenswrapper[4785]: I1126 15:25:27.058405 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=3.557555248 podStartE2EDuration="13.058390058s" podCreationTimestamp="2025-11-26 15:25:14 +0000 UTC" firstStartedPulling="2025-11-26 15:25:14.770155983 +0000 UTC 
m=+1078.448521747" lastFinishedPulling="2025-11-26 15:25:24.270990773 +0000 UTC m=+1087.949356557" observedRunningTime="2025-11-26 15:25:25.185679215 +0000 UTC m=+1088.864044979" watchObservedRunningTime="2025-11-26 15:25:27.058390058 +0000 UTC m=+1090.736755822" Nov 26 15:25:36 crc kubenswrapper[4785]: I1126 15:25:36.264476 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-jg5x5" event={"ID":"b8ae3bfd-bcde-4b41-b6ea-32789198b54f","Type":"ContainerStarted","Data":"c8b85bd7f94737f2e6c8a48ba50cac91415821b1776a1bc6e3adbaf10332887e"} Nov 26 15:25:36 crc kubenswrapper[4785]: I1126 15:25:36.280321 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-jg5x5" podStartSLOduration=6.294674032 podStartE2EDuration="17.280301739s" podCreationTimestamp="2025-11-26 15:25:19 +0000 UTC" firstStartedPulling="2025-11-26 15:25:24.573316581 +0000 UTC m=+1088.251682355" lastFinishedPulling="2025-11-26 15:25:35.558944288 +0000 UTC m=+1099.237310062" observedRunningTime="2025-11-26 15:25:36.278161111 +0000 UTC m=+1099.956526915" watchObservedRunningTime="2025-11-26 15:25:36.280301739 +0000 UTC m=+1099.958667503" Nov 26 15:25:37 crc kubenswrapper[4785]: I1126 15:25:37.289519 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:25:37 crc kubenswrapper[4785]: I1126 15:25:37.289669 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:25:43 crc kubenswrapper[4785]: 
I1126 15:25:43.322396 4785 generic.go:334] "Generic (PLEG): container finished" podID="b8ae3bfd-bcde-4b41-b6ea-32789198b54f" containerID="c8b85bd7f94737f2e6c8a48ba50cac91415821b1776a1bc6e3adbaf10332887e" exitCode=0 Nov 26 15:25:43 crc kubenswrapper[4785]: I1126 15:25:43.322581 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-jg5x5" event={"ID":"b8ae3bfd-bcde-4b41-b6ea-32789198b54f","Type":"ContainerDied","Data":"c8b85bd7f94737f2e6c8a48ba50cac91415821b1776a1bc6e3adbaf10332887e"} Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.659447 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.735026 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-config-data\") pod \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.735270 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhc4g\" (UniqueName: \"kubernetes.io/projected/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-kube-api-access-dhc4g\") pod \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.735305 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-db-sync-config-data\") pod \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\" (UID: \"b8ae3bfd-bcde-4b41-b6ea-32789198b54f\") " Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.740397 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b8ae3bfd-bcde-4b41-b6ea-32789198b54f" (UID: "b8ae3bfd-bcde-4b41-b6ea-32789198b54f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.742070 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-kube-api-access-dhc4g" (OuterVolumeSpecName: "kube-api-access-dhc4g") pod "b8ae3bfd-bcde-4b41-b6ea-32789198b54f" (UID: "b8ae3bfd-bcde-4b41-b6ea-32789198b54f"). InnerVolumeSpecName "kube-api-access-dhc4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.796452 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-config-data" (OuterVolumeSpecName: "config-data") pod "b8ae3bfd-bcde-4b41-b6ea-32789198b54f" (UID: "b8ae3bfd-bcde-4b41-b6ea-32789198b54f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.838012 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhc4g\" (UniqueName: \"kubernetes.io/projected/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-kube-api-access-dhc4g\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.838046 4785 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:44 crc kubenswrapper[4785]: I1126 15:25:44.838057 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ae3bfd-bcde-4b41-b6ea-32789198b54f-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:25:45 crc kubenswrapper[4785]: I1126 15:25:45.338922 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-jg5x5" event={"ID":"b8ae3bfd-bcde-4b41-b6ea-32789198b54f","Type":"ContainerDied","Data":"b72a8b0e5b113728a18742c67069ee1c753112c839dc29520ad409e7d93c970e"} Nov 26 15:25:45 crc kubenswrapper[4785]: I1126 15:25:45.338969 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b72a8b0e5b113728a18742c67069ee1c753112c839dc29520ad409e7d93c970e" Nov 26 15:25:45 crc kubenswrapper[4785]: I1126 15:25:45.339033 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-jg5x5" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.507924 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 15:25:46 crc kubenswrapper[4785]: E1126 15:25:46.508621 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ae3bfd-bcde-4b41-b6ea-32789198b54f" containerName="glance-db-sync" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.508640 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ae3bfd-bcde-4b41-b6ea-32789198b54f" containerName="glance-db-sync" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.508817 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ae3bfd-bcde-4b41-b6ea-32789198b54f" containerName="glance-db-sync" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.509694 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.512669 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.512922 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-krbzt" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.513237 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.532468 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.559877 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-nvme\") pod 
\"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.559951 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.559987 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560224 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-config-data\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560293 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-lib-modules\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560371 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-logs\") pod 
\"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560580 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-httpd-run\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560635 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560726 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-scripts\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560756 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-dev\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560792 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-run\") pod \"glance-default-single-1\" (UID: 
\"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560865 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4fqk\" (UniqueName: \"kubernetes.io/projected/61667a66-3950-4b82-8bf7-3f6e45dd6953-kube-api-access-z4fqk\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560916 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.560952 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-sys\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662707 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-nvme\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662761 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " 
pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662787 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662824 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-config-data\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662874 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-lib-modules\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662906 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-logs\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.662950 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-httpd-run\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 
15:25:46.662976 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663008 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-dev\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663028 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-scripts\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663051 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-run\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663118 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-run\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663476 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4fqk\" (UniqueName: 
\"kubernetes.io/projected/61667a66-3950-4b82-8bf7-3f6e45dd6953-kube-api-access-z4fqk\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663515 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663542 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-sys\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663644 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-sys\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663723 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-nvme\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663857 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") device mount 
path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663899 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663903 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.663969 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-dev\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.664258 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-logs\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.664268 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-httpd-run\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 
15:25:46.664313 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-lib-modules\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.664523 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.668587 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-scripts\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.689057 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-config-data\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.693876 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4fqk\" (UniqueName: \"kubernetes.io/projected/61667a66-3950-4b82-8bf7-3f6e45dd6953-kube-api-access-z4fqk\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.700985 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.705277 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.724390 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.725772 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-single-1\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.735788 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764303 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764343 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764364 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-run\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764390 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764487 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-nvme\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764519 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-logs\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764567 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzph\" (UniqueName: \"kubernetes.io/projected/c09ff9ef-4108-4444-80d3-b9481250da9c-kube-api-access-mzzph\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764595 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764617 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-lib-modules\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764664 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-sys\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764700 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-dev\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764761 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-config-data\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764807 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-httpd-run\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.764826 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-scripts\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.827067 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866430 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzzph\" (UniqueName: \"kubernetes.io/projected/c09ff9ef-4108-4444-80d3-b9481250da9c-kube-api-access-mzzph\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866498 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866542 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-lib-modules\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866620 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-sys\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866671 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-dev\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866694 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-config-data\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866699 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866840 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-sys\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866881 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-lib-modules\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866903 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-dev\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866713 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-httpd-run\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.866954 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-scripts\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867010 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867033 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867052 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-run\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867075 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867104 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-nvme\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867118 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-logs\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867204 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-httpd-run\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") 
" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867227 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867299 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867400 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-logs\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867405 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867528 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-nvme\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.867576 
4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-run\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.874804 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-config-data\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.884186 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-scripts\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.885976 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.887955 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:46 crc kubenswrapper[4785]: I1126 15:25:46.890106 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzph\" (UniqueName: 
\"kubernetes.io/projected/c09ff9ef-4108-4444-80d3-b9481250da9c-kube-api-access-mzzph\") pod \"glance-default-single-0\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:47 crc kubenswrapper[4785]: I1126 15:25:47.080761 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:47 crc kubenswrapper[4785]: I1126 15:25:47.258503 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 15:25:47 crc kubenswrapper[4785]: W1126 15:25:47.263596 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61667a66_3950_4b82_8bf7_3f6e45dd6953.slice/crio-6b2dbab28c7fbcc7bec800d4b890f0d77bdd1fb423aac40f989ea9e24a8a7746 WatchSource:0}: Error finding container 6b2dbab28c7fbcc7bec800d4b890f0d77bdd1fb423aac40f989ea9e24a8a7746: Status 404 returned error can't find the container with id 6b2dbab28c7fbcc7bec800d4b890f0d77bdd1fb423aac40f989ea9e24a8a7746 Nov 26 15:25:47 crc kubenswrapper[4785]: I1126 15:25:47.352495 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"61667a66-3950-4b82-8bf7-3f6e45dd6953","Type":"ContainerStarted","Data":"6b2dbab28c7fbcc7bec800d4b890f0d77bdd1fb423aac40f989ea9e24a8a7746"} Nov 26 15:25:47 crc kubenswrapper[4785]: I1126 15:25:47.480845 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:25:47 crc kubenswrapper[4785]: W1126 15:25:47.484927 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09ff9ef_4108_4444_80d3_b9481250da9c.slice/crio-e1e4b5c533c9bd4954a3d85d51dfc17b9b243a561f5c1f01aa51edfb4f4f9009 WatchSource:0}: Error finding container 
e1e4b5c533c9bd4954a3d85d51dfc17b9b243a561f5c1f01aa51edfb4f4f9009: Status 404 returned error can't find the container with id e1e4b5c533c9bd4954a3d85d51dfc17b9b243a561f5c1f01aa51edfb4f4f9009 Nov 26 15:25:48 crc kubenswrapper[4785]: I1126 15:25:48.363283 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c09ff9ef-4108-4444-80d3-b9481250da9c","Type":"ContainerStarted","Data":"e1e4b5c533c9bd4954a3d85d51dfc17b9b243a561f5c1f01aa51edfb4f4f9009"} Nov 26 15:25:54 crc kubenswrapper[4785]: I1126 15:25:54.441116 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c09ff9ef-4108-4444-80d3-b9481250da9c","Type":"ContainerStarted","Data":"855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282"} Nov 26 15:25:54 crc kubenswrapper[4785]: I1126 15:25:54.441750 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c09ff9ef-4108-4444-80d3-b9481250da9c","Type":"ContainerStarted","Data":"0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509"} Nov 26 15:25:54 crc kubenswrapper[4785]: I1126 15:25:54.445667 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"61667a66-3950-4b82-8bf7-3f6e45dd6953","Type":"ContainerStarted","Data":"6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0"} Nov 26 15:25:54 crc kubenswrapper[4785]: I1126 15:25:54.445726 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"61667a66-3950-4b82-8bf7-3f6e45dd6953","Type":"ContainerStarted","Data":"33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3"} Nov 26 15:25:54 crc kubenswrapper[4785]: I1126 15:25:54.469429 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" 
podStartSLOduration=9.46940924 podStartE2EDuration="9.46940924s" podCreationTimestamp="2025-11-26 15:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:25:54.460121544 +0000 UTC m=+1118.138487328" watchObservedRunningTime="2025-11-26 15:25:54.46940924 +0000 UTC m=+1118.147775004" Nov 26 15:25:54 crc kubenswrapper[4785]: I1126 15:25:54.495224 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=8.495207254 podStartE2EDuration="8.495207254s" podCreationTimestamp="2025-11-26 15:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:25:54.486713789 +0000 UTC m=+1118.165079573" watchObservedRunningTime="2025-11-26 15:25:54.495207254 +0000 UTC m=+1118.173573018" Nov 26 15:25:56 crc kubenswrapper[4785]: I1126 15:25:56.828165 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:56 crc kubenswrapper[4785]: I1126 15:25:56.828889 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:56 crc kubenswrapper[4785]: I1126 15:25:56.868950 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:56 crc kubenswrapper[4785]: I1126 15:25:56.889177 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.081584 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.081949 4785 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.112108 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.131938 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.468677 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.468720 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.468732 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:25:57 crc kubenswrapper[4785]: I1126 15:25:57.468743 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:26:00 crc kubenswrapper[4785]: I1126 15:26:00.424745 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:00 crc kubenswrapper[4785]: I1126 15:26:00.431600 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:26:02 crc kubenswrapper[4785]: I1126 15:26:02.296297 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:02 crc kubenswrapper[4785]: I1126 15:26:02.315150 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:26:02 crc kubenswrapper[4785]: I1126 15:26:02.391321 4785 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:02 crc kubenswrapper[4785]: I1126 15:26:02.515266 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-log" containerID="cri-o://0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509" gracePeriod=30 Nov 26 15:26:02 crc kubenswrapper[4785]: I1126 15:26:02.515324 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-httpd" containerID="cri-o://855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282" gracePeriod=30 Nov 26 15:26:03 crc kubenswrapper[4785]: I1126 15:26:03.523953 4785 generic.go:334] "Generic (PLEG): container finished" podID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerID="0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509" exitCode=143 Nov 26 15:26:03 crc kubenswrapper[4785]: I1126 15:26:03.524084 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c09ff9ef-4108-4444-80d3-b9481250da9c","Type":"ContainerDied","Data":"0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509"} Nov 26 15:26:05 crc kubenswrapper[4785]: I1126 15:26:05.947662 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077324 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-scripts\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077376 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077424 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-logs\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077455 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-var-locks-brick\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077473 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-dev\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077505 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzzph\" (UniqueName: 
\"kubernetes.io/projected/c09ff9ef-4108-4444-80d3-b9481250da9c-kube-api-access-mzzph\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077526 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-sys\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077543 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077577 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-httpd-run\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077604 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-lib-modules\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077604 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-sys" (OuterVolumeSpecName: "sys") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). 
InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077609 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-dev" (OuterVolumeSpecName: "dev") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077620 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077634 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077823 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-run\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077880 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-iscsi\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077943 4785 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-config-data\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077978 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-nvme\") pod \"c09ff9ef-4108-4444-80d3-b9481250da9c\" (UID: \"c09ff9ef-4108-4444-80d3-b9481250da9c\") " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077943 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077972 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.077999 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-run" (OuterVolumeSpecName: "run") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078085 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078221 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-logs" (OuterVolumeSpecName: "logs") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078813 4785 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-sys\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078840 4785 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078860 4785 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078877 4785 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078894 4785 reconciler_common.go:293] 
"Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078913 4785 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078929 4785 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ff9ef-4108-4444-80d3-b9481250da9c-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078948 4785 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.078968 4785 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c09ff9ef-4108-4444-80d3-b9481250da9c-dev\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.089892 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-scripts" (OuterVolumeSpecName: "scripts") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.089946 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.089963 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09ff9ef-4108-4444-80d3-b9481250da9c-kube-api-access-mzzph" (OuterVolumeSpecName: "kube-api-access-mzzph") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "kube-api-access-mzzph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.091799 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.124105 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-config-data" (OuterVolumeSpecName: "config-data") pod "c09ff9ef-4108-4444-80d3-b9481250da9c" (UID: "c09ff9ef-4108-4444-80d3-b9481250da9c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.180414 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.180450 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ff9ef-4108-4444-80d3-b9481250da9c-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.180488 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.180498 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzzph\" (UniqueName: \"kubernetes.io/projected/c09ff9ef-4108-4444-80d3-b9481250da9c-kube-api-access-mzzph\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.180514 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.192918 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.193898 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.282385 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.282420 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.555445 4785 generic.go:334] "Generic (PLEG): container finished" podID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerID="855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282" exitCode=0 Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.555495 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c09ff9ef-4108-4444-80d3-b9481250da9c","Type":"ContainerDied","Data":"855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282"} Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.555526 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c09ff9ef-4108-4444-80d3-b9481250da9c","Type":"ContainerDied","Data":"e1e4b5c533c9bd4954a3d85d51dfc17b9b243a561f5c1f01aa51edfb4f4f9009"} Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.555546 4785 scope.go:117] "RemoveContainer" containerID="855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.555541 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.587258 4785 scope.go:117] "RemoveContainer" containerID="0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.596650 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.610048 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.610418 4785 scope.go:117] "RemoveContainer" containerID="855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282" Nov 26 15:26:06 crc kubenswrapper[4785]: E1126 15:26:06.611017 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282\": container with ID starting with 855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282 not found: ID does not exist" containerID="855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.611272 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282"} err="failed to get container status \"855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282\": rpc error: code = NotFound desc = could not find container \"855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282\": container with ID starting with 855a35fc744f70c03455a11f101a9047ef394054c7d74c9f2459313952093282 not found: ID does not exist" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.611444 4785 scope.go:117] "RemoveContainer" 
containerID="0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509" Nov 26 15:26:06 crc kubenswrapper[4785]: E1126 15:26:06.612186 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509\": container with ID starting with 0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509 not found: ID does not exist" containerID="0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.612216 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509"} err="failed to get container status \"0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509\": rpc error: code = NotFound desc = could not find container \"0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509\": container with ID starting with 0a483ab208e3d4d4d975074a3654d106f9e7cdf95c537b4bcfd3409018003509 not found: ID does not exist" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.633372 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:06 crc kubenswrapper[4785]: E1126 15:26:06.633721 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-httpd" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.633736 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-httpd" Nov 26 15:26:06 crc kubenswrapper[4785]: E1126 15:26:06.633758 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-log" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.633768 4785 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-log" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.633901 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-log" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.633923 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" containerName="glance-httpd" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.634786 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.666163 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.788737 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-httpd-run\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.788785 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-dev\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.788812 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-scripts\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.788838 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789000 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789024 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djcnm\" (UniqueName: \"kubernetes.io/projected/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-kube-api-access-djcnm\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789111 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-run\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789236 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-logs\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789259 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-config-data\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789285 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-nvme\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789317 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789381 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-sys\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789403 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-lib-modules\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.789480 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891013 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-logs\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891068 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-config-data\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891098 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-nvme\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891145 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 
15:26:06.891178 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-sys\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891205 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-lib-modules\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891266 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-lib-modules\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891288 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-nvme\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891326 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891346 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-sys\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891421 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-httpd-run\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891458 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-dev\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891487 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-logs\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891500 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-scripts\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891521 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-var-locks-brick\") pod \"glance-default-single-0\" (UID: 
\"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891616 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891618 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-dev\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891686 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891715 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891729 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djcnm\" (UniqueName: \"kubernetes.io/projected/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-kube-api-access-djcnm\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" 
Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891784 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-run\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891793 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891927 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-httpd-run\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891938 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-run\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.891976 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.896489 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-config-data\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.911731 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.912165 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-scripts\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.917411 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djcnm\" (UniqueName: \"kubernetes.io/projected/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-kube-api-access-djcnm\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.918934 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:06 crc kubenswrapper[4785]: I1126 15:26:06.984254 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:07 crc kubenswrapper[4785]: I1126 15:26:07.047274 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09ff9ef-4108-4444-80d3-b9481250da9c" path="/var/lib/kubelet/pods/c09ff9ef-4108-4444-80d3-b9481250da9c/volumes" Nov 26 15:26:07 crc kubenswrapper[4785]: I1126 15:26:07.289163 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:26:07 crc kubenswrapper[4785]: I1126 15:26:07.289210 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:26:07 crc kubenswrapper[4785]: I1126 15:26:07.441171 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:07 crc kubenswrapper[4785]: I1126 15:26:07.579395 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"dc9a52a1-12cd-4620-93ad-a3efe24a9a52","Type":"ContainerStarted","Data":"49b0157cc046c2d76a8bea37a2d34206304eeb6f5a748b6a2bbf27c5f9c07b5c"} Nov 26 15:26:08 crc kubenswrapper[4785]: I1126 15:26:08.587735 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"dc9a52a1-12cd-4620-93ad-a3efe24a9a52","Type":"ContainerStarted","Data":"88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e"} Nov 26 15:26:08 crc kubenswrapper[4785]: I1126 15:26:08.588266 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"dc9a52a1-12cd-4620-93ad-a3efe24a9a52","Type":"ContainerStarted","Data":"13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97"} Nov 26 15:26:08 crc kubenswrapper[4785]: I1126 15:26:08.606044 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.606024519 podStartE2EDuration="2.606024519s" podCreationTimestamp="2025-11-26 15:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:26:08.605211537 +0000 UTC m=+1132.283577321" watchObservedRunningTime="2025-11-26 15:26:08.606024519 +0000 UTC m=+1132.284390293" Nov 26 15:26:16 crc kubenswrapper[4785]: I1126 15:26:16.984466 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:16 crc kubenswrapper[4785]: I1126 15:26:16.985069 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:17 crc kubenswrapper[4785]: I1126 15:26:17.024261 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:17 crc kubenswrapper[4785]: I1126 15:26:17.026573 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:17 crc kubenswrapper[4785]: I1126 15:26:17.659169 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:17 crc kubenswrapper[4785]: I1126 15:26:17.659242 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:19 crc kubenswrapper[4785]: I1126 15:26:19.627368 4785 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:19 crc kubenswrapper[4785]: I1126 15:26:19.642073 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.288985 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.289476 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.289535 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.290486 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"931a84441789a47b1ac55236dde8c88c217189d0a2b8d7b82c6917d783312096"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.290613 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" 
containerID="cri-o://931a84441789a47b1ac55236dde8c88c217189d0a2b8d7b82c6917d783312096" gracePeriod=600 Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.847089 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerID="931a84441789a47b1ac55236dde8c88c217189d0a2b8d7b82c6917d783312096" exitCode=0 Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.847176 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"931a84441789a47b1ac55236dde8c88c217189d0a2b8d7b82c6917d783312096"} Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.847542 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"516aa1c954a9db24fdce21dd42520586d80f2cd7166b0bbaba72d20a1a2dfd33"} Nov 26 15:26:37 crc kubenswrapper[4785]: I1126 15:26:37.847591 4785 scope.go:117] "RemoveContainer" containerID="a6ffa546d6cd3d829cf51dc79161c420c203f2371af580e656e6ea6f7619320e" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.654149 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-jg5x5"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.659935 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-jg5x5"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.706846 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancef202-account-delete-w75db"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.707878 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.713594 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef202-account-delete-w75db"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.776969 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.777374 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-httpd" containerID="cri-o://6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0" gracePeriod=30 Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.777299 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-log" containerID="cri-o://33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3" gracePeriod=30 Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.789709 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.789988 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-log" containerID="cri-o://13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97" gracePeriod=30 Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.790119 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-httpd" 
containerID="cri-o://88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e" gracePeriod=30 Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.808068 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b0671e-f3d2-47ab-88c0-bb29580c3139-operator-scripts\") pod \"glancef202-account-delete-w75db\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.808174 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bqk\" (UniqueName: \"kubernetes.io/projected/30b0671e-f3d2-47ab-88c0-bb29580c3139-kube-api-access-j6bqk\") pod \"glancef202-account-delete-w75db\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.885128 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.885389 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" containerName="openstackclient" containerID="cri-o://dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6" gracePeriod=30 Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.912620 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bqk\" (UniqueName: \"kubernetes.io/projected/30b0671e-f3d2-47ab-88c0-bb29580c3139-kube-api-access-j6bqk\") pod \"glancef202-account-delete-w75db\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.912756 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b0671e-f3d2-47ab-88c0-bb29580c3139-operator-scripts\") pod \"glancef202-account-delete-w75db\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.913950 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b0671e-f3d2-47ab-88c0-bb29580c3139-operator-scripts\") pod \"glancef202-account-delete-w75db\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:38 crc kubenswrapper[4785]: I1126 15:26:38.936180 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bqk\" (UniqueName: \"kubernetes.io/projected/30b0671e-f3d2-47ab-88c0-bb29580c3139-kube-api-access-j6bqk\") pod \"glancef202-account-delete-w75db\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.045377 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ae3bfd-bcde-4b41-b6ea-32789198b54f" path="/var/lib/kubelet/pods/b8ae3bfd-bcde-4b41-b6ea-32789198b54f/volumes" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.064033 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.247091 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.317384 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-scripts\") pod \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.317466 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config-secret\") pod \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.317537 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config\") pod \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.317570 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khjmf\" (UniqueName: \"kubernetes.io/projected/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-kube-api-access-khjmf\") pod \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.318944 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" (UID: "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb"). InnerVolumeSpecName "openstack-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.322722 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-kube-api-access-khjmf" (OuterVolumeSpecName: "kube-api-access-khjmf") pod "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" (UID: "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb"). InnerVolumeSpecName "kube-api-access-khjmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:39 crc kubenswrapper[4785]: E1126 15:26:39.335276 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config podName:57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb nodeName:}" failed. No retries permitted until 2025-11-26 15:26:39.835242681 +0000 UTC m=+1163.513608445 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config") pod "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" (UID: "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb") : error deleting /var/lib/kubelet/pods/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb/volume-subpaths: remove /var/lib/kubelet/pods/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb/volume-subpaths: no such file or directory Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.339359 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" (UID: "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.419107 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khjmf\" (UniqueName: \"kubernetes.io/projected/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-kube-api-access-khjmf\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.419141 4785 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.419150 4785 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.470839 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef202-account-delete-w75db"] Nov 26 15:26:39 crc kubenswrapper[4785]: W1126 15:26:39.478630 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b0671e_f3d2_47ab_88c0_bb29580c3139.slice/crio-520fcb0173c09f0c65beeaa2ffc630427bdb9a7a1674c020bbb533b5c89d6fcd WatchSource:0}: Error finding container 520fcb0173c09f0c65beeaa2ffc630427bdb9a7a1674c020bbb533b5c89d6fcd: Status 404 returned error can't find the container with id 520fcb0173c09f0c65beeaa2ffc630427bdb9a7a1674c020bbb533b5c89d6fcd Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.865344 4785 generic.go:334] "Generic (PLEG): container finished" podID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerID="13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97" exitCode=143 Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.865415 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"dc9a52a1-12cd-4620-93ad-a3efe24a9a52","Type":"ContainerDied","Data":"13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97"} Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.867421 4785 generic.go:334] "Generic (PLEG): container finished" podID="57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" containerID="dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6" exitCode=143 Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.867474 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb","Type":"ContainerDied","Data":"dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6"} Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.867511 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.867836 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb","Type":"ContainerDied","Data":"31dc47a1decc816f7599e2b4a6f82f7731c1cd2a4c724dc73569b1970eee545e"} Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.867872 4785 scope.go:117] "RemoveContainer" containerID="dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.868989 4785 generic.go:334] "Generic (PLEG): container finished" podID="30b0671e-f3d2-47ab-88c0-bb29580c3139" containerID="a22899436e8934c2833baf58645fe45b83e14098745e66a57da67570714e34db" exitCode=0 Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.869078 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef202-account-delete-w75db" 
event={"ID":"30b0671e-f3d2-47ab-88c0-bb29580c3139","Type":"ContainerDied","Data":"a22899436e8934c2833baf58645fe45b83e14098745e66a57da67570714e34db"} Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.869124 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef202-account-delete-w75db" event={"ID":"30b0671e-f3d2-47ab-88c0-bb29580c3139","Type":"ContainerStarted","Data":"520fcb0173c09f0c65beeaa2ffc630427bdb9a7a1674c020bbb533b5c89d6fcd"} Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.871482 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"61667a66-3950-4b82-8bf7-3f6e45dd6953","Type":"ContainerDied","Data":"33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3"} Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.871376 4785 generic.go:334] "Generic (PLEG): container finished" podID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerID="33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3" exitCode=143 Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.904282 4785 scope.go:117] "RemoveContainer" containerID="dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6" Nov 26 15:26:39 crc kubenswrapper[4785]: E1126 15:26:39.904706 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6\": container with ID starting with dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6 not found: ID does not exist" containerID="dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.904751 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6"} err="failed to get container status 
\"dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6\": rpc error: code = NotFound desc = could not find container \"dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6\": container with ID starting with dc1013b1b55bc0934364cfa2c204ec072a83a7854196dafb89c23a1dbe8640b6 not found: ID does not exist" Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.925707 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config\") pod \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\" (UID: \"57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb\") " Nov 26 15:26:39 crc kubenswrapper[4785]: I1126 15:26:39.926591 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" (UID: "57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:26:40 crc kubenswrapper[4785]: I1126 15:26:40.027113 4785 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:40 crc kubenswrapper[4785]: I1126 15:26:40.204670 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:26:40 crc kubenswrapper[4785]: I1126 15:26:40.212365 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.049523 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" path="/var/lib/kubelet/pods/57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb/volumes" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.187722 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.244456 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b0671e-f3d2-47ab-88c0-bb29580c3139-operator-scripts\") pod \"30b0671e-f3d2-47ab-88c0-bb29580c3139\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.244600 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6bqk\" (UniqueName: \"kubernetes.io/projected/30b0671e-f3d2-47ab-88c0-bb29580c3139-kube-api-access-j6bqk\") pod \"30b0671e-f3d2-47ab-88c0-bb29580c3139\" (UID: \"30b0671e-f3d2-47ab-88c0-bb29580c3139\") " Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.245459 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b0671e-f3d2-47ab-88c0-bb29580c3139-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30b0671e-f3d2-47ab-88c0-bb29580c3139" (UID: "30b0671e-f3d2-47ab-88c0-bb29580c3139"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.249412 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b0671e-f3d2-47ab-88c0-bb29580c3139-kube-api-access-j6bqk" (OuterVolumeSpecName: "kube-api-access-j6bqk") pod "30b0671e-f3d2-47ab-88c0-bb29580c3139" (UID: "30b0671e-f3d2-47ab-88c0-bb29580c3139"). InnerVolumeSpecName "kube-api-access-j6bqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.347161 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b0671e-f3d2-47ab-88c0-bb29580c3139-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.347219 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6bqk\" (UniqueName: \"kubernetes.io/projected/30b0671e-f3d2-47ab-88c0-bb29580c3139-kube-api-access-j6bqk\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.894437 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef202-account-delete-w75db" event={"ID":"30b0671e-f3d2-47ab-88c0-bb29580c3139","Type":"ContainerDied","Data":"520fcb0173c09f0c65beeaa2ffc630427bdb9a7a1674c020bbb533b5c89d6fcd"} Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.894488 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="520fcb0173c09f0c65beeaa2ffc630427bdb9a7a1674c020bbb533b5c89d6fcd" Nov 26 15:26:41 crc kubenswrapper[4785]: I1126 15:26:41.894520 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef202-account-delete-w75db" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.250932 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.333735 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361578 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361659 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-httpd-run\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361699 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-run\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361725 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-iscsi\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361755 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361796 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-logs\") pod 
\"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361837 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-nvme\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361868 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361794 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-run" (OuterVolumeSpecName: "run") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361821 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361914 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361869 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361942 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-dev\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.361973 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-var-locks-brick\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362003 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-lib-modules\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362041 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362040 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4fqk\" (UniqueName: \"kubernetes.io/projected/61667a66-3950-4b82-8bf7-3f6e45dd6953-kube-api-access-z4fqk\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362113 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-sys\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362039 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362037 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-dev" (OuterVolumeSpecName: "dev") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362157 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-nvme\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362187 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362199 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-logs" (OuterVolumeSpecName: "logs") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362229 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-sys" (OuterVolumeSpecName: "sys") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362230 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-lib-modules\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362305 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-iscsi\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362332 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djcnm\" (UniqueName: \"kubernetes.io/projected/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-kube-api-access-djcnm\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362355 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-dev\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362256 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362376 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-httpd-run\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362419 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-scripts\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362433 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-run\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362454 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-logs\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362467 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-var-locks-brick\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362490 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362525 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-config-data\") pod \"61667a66-3950-4b82-8bf7-3f6e45dd6953\" (UID: \"61667a66-3950-4b82-8bf7-3f6e45dd6953\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362545 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-config-data\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362578 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-scripts\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362591 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-sys\") pod \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\" (UID: \"dc9a52a1-12cd-4620-93ad-a3efe24a9a52\") " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362347 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362780 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-dev" (OuterVolumeSpecName: "dev") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.362889 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363063 4785 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363090 4785 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363107 4785 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363125 4785 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363142 4785 reconciler_common.go:293] 
"Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363160 4785 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363178 4785 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-dev\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363195 4785 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363212 4785 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-sys\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363228 4785 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363245 4785 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363262 4785 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc 
kubenswrapper[4785]: I1126 15:26:42.363278 4785 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61667a66-3950-4b82-8bf7-3f6e45dd6953-dev\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363294 4785 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363684 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-run" (OuterVolumeSpecName: "run") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363753 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.363783 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-sys" (OuterVolumeSpecName: "sys") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.364038 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-logs" (OuterVolumeSpecName: "logs") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.367035 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-scripts" (OuterVolumeSpecName: "scripts") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.367040 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.367116 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.367218 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-kube-api-access-djcnm" (OuterVolumeSpecName: "kube-api-access-djcnm") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "kube-api-access-djcnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.368175 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61667a66-3950-4b82-8bf7-3f6e45dd6953-kube-api-access-z4fqk" (OuterVolumeSpecName: "kube-api-access-z4fqk") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "kube-api-access-z4fqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.370141 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.377048 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.382667 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-scripts" (OuterVolumeSpecName: "scripts") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.414434 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-config-data" (OuterVolumeSpecName: "config-data") pod "dc9a52a1-12cd-4620-93ad-a3efe24a9a52" (UID: "dc9a52a1-12cd-4620-93ad-a3efe24a9a52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.424950 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-config-data" (OuterVolumeSpecName: "config-data") pod "61667a66-3950-4b82-8bf7-3f6e45dd6953" (UID: "61667a66-3950-4b82-8bf7-3f6e45dd6953"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464530 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464655 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4fqk\" (UniqueName: \"kubernetes.io/projected/61667a66-3950-4b82-8bf7-3f6e45dd6953-kube-api-access-z4fqk\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464670 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djcnm\" (UniqueName: \"kubernetes.io/projected/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-kube-api-access-djcnm\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464679 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464688 4785 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464697 4785 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61667a66-3950-4b82-8bf7-3f6e45dd6953-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464706 4785 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464721 4785 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464731 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61667a66-3950-4b82-8bf7-3f6e45dd6953-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464740 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464748 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464756 4785 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9a52a1-12cd-4620-93ad-a3efe24a9a52-sys\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464770 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.464783 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.476892 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.476952 4785 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.477057 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.479343 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.565688 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.565731 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.565741 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.565750 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.902433 4785 generic.go:334] "Generic (PLEG): container finished" podID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerID="6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0" exitCode=0 Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.902499 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.902521 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"61667a66-3950-4b82-8bf7-3f6e45dd6953","Type":"ContainerDied","Data":"6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0"} Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.902626 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"61667a66-3950-4b82-8bf7-3f6e45dd6953","Type":"ContainerDied","Data":"6b2dbab28c7fbcc7bec800d4b890f0d77bdd1fb423aac40f989ea9e24a8a7746"} Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.902666 4785 scope.go:117] "RemoveContainer" containerID="6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.904513 4785 generic.go:334] "Generic (PLEG): container finished" podID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerID="88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e" exitCode=0 Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.904660 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.904717 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"dc9a52a1-12cd-4620-93ad-a3efe24a9a52","Type":"ContainerDied","Data":"88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e"} Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.904769 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"dc9a52a1-12cd-4620-93ad-a3efe24a9a52","Type":"ContainerDied","Data":"49b0157cc046c2d76a8bea37a2d34206304eeb6f5a748b6a2bbf27c5f9c07b5c"} Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.933866 4785 scope.go:117] "RemoveContainer" containerID="33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.961686 4785 scope.go:117] "RemoveContainer" containerID="6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0" Nov 26 15:26:42 crc kubenswrapper[4785]: E1126 15:26:42.962296 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0\": container with ID starting with 6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0 not found: ID does not exist" containerID="6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.962370 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0"} err="failed to get container status \"6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0\": rpc error: code = NotFound desc = could not find container \"6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0\": 
container with ID starting with 6ec36ae05819f8fb34b1132d6a92dd2eed8f56e8a5f821dac544950caf201de0 not found: ID does not exist" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.962411 4785 scope.go:117] "RemoveContainer" containerID="33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3" Nov 26 15:26:42 crc kubenswrapper[4785]: E1126 15:26:42.962778 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3\": container with ID starting with 33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3 not found: ID does not exist" containerID="33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.962809 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3"} err="failed to get container status \"33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3\": rpc error: code = NotFound desc = could not find container \"33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3\": container with ID starting with 33272b0cd95abb9205ee4016a4b0cb79b068dc24d7ef859f220f4d2312fae2f3 not found: ID does not exist" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.962835 4785 scope.go:117] "RemoveContainer" containerID="88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e" Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.974956 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.981708 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:42 crc kubenswrapper[4785]: I1126 15:26:42.988226 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.002958 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.009429 4785 scope.go:117] "RemoveContainer" containerID="13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.025919 4785 scope.go:117] "RemoveContainer" containerID="88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e" Nov 26 15:26:43 crc kubenswrapper[4785]: E1126 15:26:43.026420 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e\": container with ID starting with 88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e not found: ID does not exist" containerID="88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.026463 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e"} err="failed to get container status \"88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e\": rpc error: code = NotFound desc = could not find container \"88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e\": container with ID starting with 88454a72926dc7f711ffee8b3886325e47489edf2878329f4ff1c3bbff5d315e not found: ID does not exist" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.026513 4785 scope.go:117] "RemoveContainer" containerID="13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97" Nov 26 15:26:43 crc kubenswrapper[4785]: E1126 15:26:43.026841 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97\": container with ID starting with 13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97 not found: ID does not exist" containerID="13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.026893 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97"} err="failed to get container status \"13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97\": rpc error: code = NotFound desc = could not find container \"13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97\": container with ID starting with 13648058e886cece9bc35e0f2e07e768f1d478fcc56b30a3011754830e6afc97 not found: ID does not exist" Nov 26 15:26:43 crc kubenswrapper[4785]: E1126 15:26:43.027787 4785 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc9a52a1_12cd_4620_93ad_a3efe24a9a52.slice/crio-49b0157cc046c2d76a8bea37a2d34206304eeb6f5a748b6a2bbf27c5f9c07b5c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61667a66_3950_4b82_8bf7_3f6e45dd6953.slice\": RecentStats: unable to find data in memory cache]" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.044677 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" path="/var/lib/kubelet/pods/61667a66-3950-4b82-8bf7-3f6e45dd6953/volumes" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.045358 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" path="/var/lib/kubelet/pods/dc9a52a1-12cd-4620-93ad-a3efe24a9a52/volumes" Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 
15:26:43.729383 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-dz8z5"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.735755 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-dz8z5"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.745537 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-f202-account-create-update-ntk2j"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.752207 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-f202-account-create-update-ntk2j"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.758593 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancef202-account-delete-w75db"] Nov 26 15:26:43 crc kubenswrapper[4785]: I1126 15:26:43.763642 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancef202-account-delete-w75db"] Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040195 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-ftcc4"] Nov 26 15:26:44 crc kubenswrapper[4785]: E1126 15:26:44.040507 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-httpd" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040523 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-httpd" Nov 26 15:26:44 crc kubenswrapper[4785]: E1126 15:26:44.040545 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-log" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040574 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-log" Nov 26 15:26:44 crc kubenswrapper[4785]: E1126 
15:26:44.040588 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" containerName="openstackclient" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040597 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" containerName="openstackclient" Nov 26 15:26:44 crc kubenswrapper[4785]: E1126 15:26:44.040605 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b0671e-f3d2-47ab-88c0-bb29580c3139" containerName="mariadb-account-delete" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040614 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b0671e-f3d2-47ab-88c0-bb29580c3139" containerName="mariadb-account-delete" Nov 26 15:26:44 crc kubenswrapper[4785]: E1126 15:26:44.040628 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-httpd" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040635 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-httpd" Nov 26 15:26:44 crc kubenswrapper[4785]: E1126 15:26:44.040650 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-log" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040658 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-log" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040804 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b0671e-f3d2-47ab-88c0-bb29580c3139" containerName="mariadb-account-delete" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040835 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-log" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 
15:26:44.040844 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-log" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040857 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="57bb65c9-f6e0-4ee2-ac07-5e1e6d2f70eb" containerName="openstackclient" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040867 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="61667a66-3950-4b82-8bf7-3f6e45dd6953" containerName="glance-httpd" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.040876 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9a52a1-12cd-4620-93ad-a3efe24a9a52" containerName="glance-httpd" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.041410 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.050635 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-ftcc4"] Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.055678 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h"] Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.057412 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.059042 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.063786 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h"] Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.084265 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnkn\" (UniqueName: \"kubernetes.io/projected/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-kube-api-access-kdnkn\") pod \"glance-4fd5-account-create-update-kwb7h\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.084317 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2lp\" (UniqueName: \"kubernetes.io/projected/bb6be7fd-738f-4482-b123-b5bec25f89ed-kube-api-access-hm2lp\") pod \"glance-db-create-ftcc4\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.084369 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-operator-scripts\") pod \"glance-4fd5-account-create-update-kwb7h\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.084452 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bb6be7fd-738f-4482-b123-b5bec25f89ed-operator-scripts\") pod \"glance-db-create-ftcc4\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.185742 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnkn\" (UniqueName: \"kubernetes.io/projected/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-kube-api-access-kdnkn\") pod \"glance-4fd5-account-create-update-kwb7h\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.185777 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2lp\" (UniqueName: \"kubernetes.io/projected/bb6be7fd-738f-4482-b123-b5bec25f89ed-kube-api-access-hm2lp\") pod \"glance-db-create-ftcc4\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.185801 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-operator-scripts\") pod \"glance-4fd5-account-create-update-kwb7h\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.185859 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6be7fd-738f-4482-b123-b5bec25f89ed-operator-scripts\") pod \"glance-db-create-ftcc4\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.186491 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6be7fd-738f-4482-b123-b5bec25f89ed-operator-scripts\") pod \"glance-db-create-ftcc4\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.186681 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-operator-scripts\") pod \"glance-4fd5-account-create-update-kwb7h\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.203728 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnkn\" (UniqueName: \"kubernetes.io/projected/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-kube-api-access-kdnkn\") pod \"glance-4fd5-account-create-update-kwb7h\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.205143 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2lp\" (UniqueName: \"kubernetes.io/projected/bb6be7fd-738f-4482-b123-b5bec25f89ed-kube-api-access-hm2lp\") pod \"glance-db-create-ftcc4\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.421456 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.431595 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.927027 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-ftcc4"] Nov 26 15:26:44 crc kubenswrapper[4785]: I1126 15:26:44.943440 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h"] Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.048621 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0484b3-bba6-4484-a160-ea8d7ce284d7" path="/var/lib/kubelet/pods/1c0484b3-bba6-4484-a160-ea8d7ce284d7/volumes" Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.050901 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b0671e-f3d2-47ab-88c0-bb29580c3139" path="/var/lib/kubelet/pods/30b0671e-f3d2-47ab-88c0-bb29580c3139/volumes" Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.052355 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61469b1b-225b-4ccb-b192-167ce458296d" path="/var/lib/kubelet/pods/61469b1b-225b-4ccb-b192-167ce458296d/volumes" Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.943960 4785 generic.go:334] "Generic (PLEG): container finished" podID="bb6be7fd-738f-4482-b123-b5bec25f89ed" containerID="99d413fedadb24041128646f11fb62f124690d7419c9688868c720918a96507f" exitCode=0 Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.944100 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ftcc4" event={"ID":"bb6be7fd-738f-4482-b123-b5bec25f89ed","Type":"ContainerDied","Data":"99d413fedadb24041128646f11fb62f124690d7419c9688868c720918a96507f"} Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.944536 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ftcc4" 
event={"ID":"bb6be7fd-738f-4482-b123-b5bec25f89ed","Type":"ContainerStarted","Data":"c1fd575c93a06773e0ef9745172173c7f0cae8ef678f59ab3b05d484d2a4b437"} Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.948097 4785 generic.go:334] "Generic (PLEG): container finished" podID="e9c73029-3f12-4d6e-90f0-8bac8b6acf15" containerID="3fd311ccc70cf8f4a3a683a80de70adbe247031b5a71df152a6b8b8b897179f4" exitCode=0 Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.948169 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" event={"ID":"e9c73029-3f12-4d6e-90f0-8bac8b6acf15","Type":"ContainerDied","Data":"3fd311ccc70cf8f4a3a683a80de70adbe247031b5a71df152a6b8b8b897179f4"} Nov 26 15:26:45 crc kubenswrapper[4785]: I1126 15:26:45.948211 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" event={"ID":"e9c73029-3f12-4d6e-90f0-8bac8b6acf15","Type":"ContainerStarted","Data":"b922842525c9892bd671809789f424842d5ebf9bac46f5fec6492527cf27b01c"} Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.274069 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.282274 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.333510 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm2lp\" (UniqueName: \"kubernetes.io/projected/bb6be7fd-738f-4482-b123-b5bec25f89ed-kube-api-access-hm2lp\") pod \"bb6be7fd-738f-4482-b123-b5bec25f89ed\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.333698 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-operator-scripts\") pod \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.333833 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdnkn\" (UniqueName: \"kubernetes.io/projected/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-kube-api-access-kdnkn\") pod \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\" (UID: \"e9c73029-3f12-4d6e-90f0-8bac8b6acf15\") " Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.333861 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6be7fd-738f-4482-b123-b5bec25f89ed-operator-scripts\") pod \"bb6be7fd-738f-4482-b123-b5bec25f89ed\" (UID: \"bb6be7fd-738f-4482-b123-b5bec25f89ed\") " Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.334344 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9c73029-3f12-4d6e-90f0-8bac8b6acf15" (UID: "e9c73029-3f12-4d6e-90f0-8bac8b6acf15"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.334746 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6be7fd-738f-4482-b123-b5bec25f89ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb6be7fd-738f-4482-b123-b5bec25f89ed" (UID: "bb6be7fd-738f-4482-b123-b5bec25f89ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.335030 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.339762 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-kube-api-access-kdnkn" (OuterVolumeSpecName: "kube-api-access-kdnkn") pod "e9c73029-3f12-4d6e-90f0-8bac8b6acf15" (UID: "e9c73029-3f12-4d6e-90f0-8bac8b6acf15"). InnerVolumeSpecName "kube-api-access-kdnkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.340824 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6be7fd-738f-4482-b123-b5bec25f89ed-kube-api-access-hm2lp" (OuterVolumeSpecName: "kube-api-access-hm2lp") pod "bb6be7fd-738f-4482-b123-b5bec25f89ed" (UID: "bb6be7fd-738f-4482-b123-b5bec25f89ed"). InnerVolumeSpecName "kube-api-access-hm2lp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.435764 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdnkn\" (UniqueName: \"kubernetes.io/projected/e9c73029-3f12-4d6e-90f0-8bac8b6acf15-kube-api-access-kdnkn\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.435793 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb6be7fd-738f-4482-b123-b5bec25f89ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.435802 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm2lp\" (UniqueName: \"kubernetes.io/projected/bb6be7fd-738f-4482-b123-b5bec25f89ed-kube-api-access-hm2lp\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.963863 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-ftcc4" event={"ID":"bb6be7fd-738f-4482-b123-b5bec25f89ed","Type":"ContainerDied","Data":"c1fd575c93a06773e0ef9745172173c7f0cae8ef678f59ab3b05d484d2a4b437"} Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.964104 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1fd575c93a06773e0ef9745172173c7f0cae8ef678f59ab3b05d484d2a4b437" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.964195 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-ftcc4" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.967130 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" event={"ID":"e9c73029-3f12-4d6e-90f0-8bac8b6acf15","Type":"ContainerDied","Data":"b922842525c9892bd671809789f424842d5ebf9bac46f5fec6492527cf27b01c"} Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.967191 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b922842525c9892bd671809789f424842d5ebf9bac46f5fec6492527cf27b01c" Nov 26 15:26:47 crc kubenswrapper[4785]: I1126 15:26:47.967218 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.183245 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-xh4fd"] Nov 26 15:26:49 crc kubenswrapper[4785]: E1126 15:26:49.183818 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c73029-3f12-4d6e-90f0-8bac8b6acf15" containerName="mariadb-account-create-update" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.183830 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c73029-3f12-4d6e-90f0-8bac8b6acf15" containerName="mariadb-account-create-update" Nov 26 15:26:49 crc kubenswrapper[4785]: E1126 15:26:49.183858 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6be7fd-738f-4482-b123-b5bec25f89ed" containerName="mariadb-database-create" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.183864 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6be7fd-738f-4482-b123-b5bec25f89ed" containerName="mariadb-database-create" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.183986 4785 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e9c73029-3f12-4d6e-90f0-8bac8b6acf15" containerName="mariadb-account-create-update" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.183995 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6be7fd-738f-4482-b123-b5bec25f89ed" containerName="mariadb-database-create" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.184410 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.186214 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.187068 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.188082 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-9hs9z" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.198826 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-xh4fd"] Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.259013 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-config-data\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.259076 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-combined-ca-bundle\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc 
kubenswrapper[4785]: I1126 15:26:49.259139 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnh2k\" (UniqueName: \"kubernetes.io/projected/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-kube-api-access-gnh2k\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.259192 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-db-sync-config-data\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.360487 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-config-data\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.360539 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-combined-ca-bundle\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.360590 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnh2k\" (UniqueName: \"kubernetes.io/projected/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-kube-api-access-gnh2k\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: 
I1126 15:26:49.360663 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-db-sync-config-data\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.365884 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-combined-ca-bundle\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.366026 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-db-sync-config-data\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.366760 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-config-data\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.378037 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnh2k\" (UniqueName: \"kubernetes.io/projected/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-kube-api-access-gnh2k\") pod \"glance-db-sync-xh4fd\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.499756 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:49 crc kubenswrapper[4785]: W1126 15:26:49.915443 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb72d2801_dc26_4fc0_aaf7_9def6813ae4c.slice/crio-7d8de44e30d2e4337a2b1a2cc6f8230270b306602a8f3a022886b39d57836524 WatchSource:0}: Error finding container 7d8de44e30d2e4337a2b1a2cc6f8230270b306602a8f3a022886b39d57836524: Status 404 returned error can't find the container with id 7d8de44e30d2e4337a2b1a2cc6f8230270b306602a8f3a022886b39d57836524 Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.917317 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-xh4fd"] Nov 26 15:26:49 crc kubenswrapper[4785]: I1126 15:26:49.994545 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-xh4fd" event={"ID":"b72d2801-dc26-4fc0-aaf7-9def6813ae4c","Type":"ContainerStarted","Data":"7d8de44e30d2e4337a2b1a2cc6f8230270b306602a8f3a022886b39d57836524"} Nov 26 15:26:51 crc kubenswrapper[4785]: I1126 15:26:51.003030 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-xh4fd" event={"ID":"b72d2801-dc26-4fc0-aaf7-9def6813ae4c","Type":"ContainerStarted","Data":"8c84a90c72de5e5893aba6898eed2afdef0fdc40d4d733ea01f11a13a269d5d7"} Nov 26 15:26:51 crc kubenswrapper[4785]: I1126 15:26:51.026316 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-xh4fd" podStartSLOduration=2.026294564 podStartE2EDuration="2.026294564s" podCreationTimestamp="2025-11-26 15:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:26:51.023057729 +0000 UTC m=+1174.701423523" watchObservedRunningTime="2025-11-26 15:26:51.026294564 +0000 UTC m=+1174.704660338" Nov 26 15:26:54 crc 
kubenswrapper[4785]: I1126 15:26:54.030837 4785 generic.go:334] "Generic (PLEG): container finished" podID="b72d2801-dc26-4fc0-aaf7-9def6813ae4c" containerID="8c84a90c72de5e5893aba6898eed2afdef0fdc40d4d733ea01f11a13a269d5d7" exitCode=0 Nov 26 15:26:54 crc kubenswrapper[4785]: I1126 15:26:54.031625 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-xh4fd" event={"ID":"b72d2801-dc26-4fc0-aaf7-9def6813ae4c","Type":"ContainerDied","Data":"8c84a90c72de5e5893aba6898eed2afdef0fdc40d4d733ea01f11a13a269d5d7"} Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.328740 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.451675 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-combined-ca-bundle\") pod \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.451729 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnh2k\" (UniqueName: \"kubernetes.io/projected/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-kube-api-access-gnh2k\") pod \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.451868 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-config-data\") pod \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.451898 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-db-sync-config-data\") pod \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\" (UID: \"b72d2801-dc26-4fc0-aaf7-9def6813ae4c\") " Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.457238 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b72d2801-dc26-4fc0-aaf7-9def6813ae4c" (UID: "b72d2801-dc26-4fc0-aaf7-9def6813ae4c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.459096 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-kube-api-access-gnh2k" (OuterVolumeSpecName: "kube-api-access-gnh2k") pod "b72d2801-dc26-4fc0-aaf7-9def6813ae4c" (UID: "b72d2801-dc26-4fc0-aaf7-9def6813ae4c"). InnerVolumeSpecName "kube-api-access-gnh2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.476591 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b72d2801-dc26-4fc0-aaf7-9def6813ae4c" (UID: "b72d2801-dc26-4fc0-aaf7-9def6813ae4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.491847 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-config-data" (OuterVolumeSpecName: "config-data") pod "b72d2801-dc26-4fc0-aaf7-9def6813ae4c" (UID: "b72d2801-dc26-4fc0-aaf7-9def6813ae4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.553340 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnh2k\" (UniqueName: \"kubernetes.io/projected/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-kube-api-access-gnh2k\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.553385 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.553399 4785 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:55 crc kubenswrapper[4785]: I1126 15:26:55.553411 4785 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72d2801-dc26-4fc0-aaf7-9def6813ae4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.049601 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-xh4fd" event={"ID":"b72d2801-dc26-4fc0-aaf7-9def6813ae4c","Type":"ContainerDied","Data":"7d8de44e30d2e4337a2b1a2cc6f8230270b306602a8f3a022886b39d57836524"} Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.049658 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-xh4fd" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.049676 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d8de44e30d2e4337a2b1a2cc6f8230270b306602a8f3a022886b39d57836524" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.328575 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:56 crc kubenswrapper[4785]: E1126 15:26:56.328824 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b72d2801-dc26-4fc0-aaf7-9def6813ae4c" containerName="glance-db-sync" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.328845 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="b72d2801-dc26-4fc0-aaf7-9def6813ae4c" containerName="glance-db-sync" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.329000 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="b72d2801-dc26-4fc0-aaf7-9def6813ae4c" containerName="glance-db-sync" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.329767 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.335040 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.335303 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.335389 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.335402 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-9hs9z" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.335460 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.335576 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.347514 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467514 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-config-data\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467629 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-httpd-run\") pod 
\"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467668 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-scripts\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467694 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467730 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-logs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467757 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467823 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467890 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.467925 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbftj\" (UniqueName: \"kubernetes.io/projected/76e8eead-f5a5-4c43-a079-0a20eebc4b72-kube-api-access-dbftj\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.569508 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-config-data\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570205 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-httpd-run\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570240 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-scripts\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570261 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570292 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-logs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570316 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570380 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570442 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.570467 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbftj\" (UniqueName: \"kubernetes.io/projected/76e8eead-f5a5-4c43-a079-0a20eebc4b72-kube-api-access-dbftj\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.571175 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-httpd-run\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.571254 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.571448 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-logs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.574155 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.574209 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.574694 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-config-data\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.575019 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-scripts\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.575191 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.598229 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.600775 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbftj\" (UniqueName: \"kubernetes.io/projected/76e8eead-f5a5-4c43-a079-0a20eebc4b72-kube-api-access-dbftj\") pod \"glance-default-single-0\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:56 crc kubenswrapper[4785]: I1126 15:26:56.649071 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:57 crc kubenswrapper[4785]: I1126 15:26:57.058680 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:57 crc kubenswrapper[4785]: I1126 15:26:57.231776 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.064571 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"76e8eead-f5a5-4c43-a079-0a20eebc4b72","Type":"ContainerStarted","Data":"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b"} Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.064862 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"76e8eead-f5a5-4c43-a079-0a20eebc4b72","Type":"ContainerStarted","Data":"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab"} Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.064872 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"76e8eead-f5a5-4c43-a079-0a20eebc4b72","Type":"ContainerStarted","Data":"ae765325d4ff908d24ae93f7f2787644307fd719a38077c88e35297fb2fd20bb"} Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.064792 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" 
podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-httpd" containerID="cri-o://26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b" gracePeriod=30 Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.064677 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-log" containerID="cri-o://1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab" gracePeriod=30 Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.091756 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.091730227 podStartE2EDuration="2.091730227s" podCreationTimestamp="2025-11-26 15:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:26:58.08657612 +0000 UTC m=+1181.764941894" watchObservedRunningTime="2025-11-26 15:26:58.091730227 +0000 UTC m=+1181.770096021" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.524718 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701502 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-public-tls-certs\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701543 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-scripts\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701630 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-logs\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701675 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-internal-tls-certs\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701722 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701739 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-httpd-run\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701766 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-combined-ca-bundle\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701794 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-config-data\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.701843 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbftj\" (UniqueName: \"kubernetes.io/projected/76e8eead-f5a5-4c43-a079-0a20eebc4b72-kube-api-access-dbftj\") pod \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\" (UID: \"76e8eead-f5a5-4c43-a079-0a20eebc4b72\") " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.702135 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-logs" (OuterVolumeSpecName: "logs") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.702336 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.707266 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-scripts" (OuterVolumeSpecName: "scripts") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.708098 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.708650 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e8eead-f5a5-4c43-a079-0a20eebc4b72-kube-api-access-dbftj" (OuterVolumeSpecName: "kube-api-access-dbftj") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "kube-api-access-dbftj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.737823 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.744964 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.753260 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.758890 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-config-data" (OuterVolumeSpecName: "config-data") pod "76e8eead-f5a5-4c43-a079-0a20eebc4b72" (UID: "76e8eead-f5a5-4c43-a079-0a20eebc4b72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.803923 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbftj\" (UniqueName: \"kubernetes.io/projected/76e8eead-f5a5-4c43-a079-0a20eebc4b72-kube-api-access-dbftj\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804136 4785 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804255 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804370 4785 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804479 4785 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804669 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804809 4785 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76e8eead-f5a5-4c43-a079-0a20eebc4b72-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.804925 4785 reconciler_common.go:293] "Volume detached 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.805033 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e8eead-f5a5-4c43-a079-0a20eebc4b72-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.816611 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 26 15:26:58 crc kubenswrapper[4785]: I1126 15:26:58.906925 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.075932 4785 generic.go:334] "Generic (PLEG): container finished" podID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerID="26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b" exitCode=143 Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.075975 4785 generic.go:334] "Generic (PLEG): container finished" podID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerID="1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab" exitCode=143 Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.076005 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"76e8eead-f5a5-4c43-a079-0a20eebc4b72","Type":"ContainerDied","Data":"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b"} Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.076053 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"76e8eead-f5a5-4c43-a079-0a20eebc4b72","Type":"ContainerDied","Data":"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab"} Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.076077 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"76e8eead-f5a5-4c43-a079-0a20eebc4b72","Type":"ContainerDied","Data":"ae765325d4ff908d24ae93f7f2787644307fd719a38077c88e35297fb2fd20bb"} Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.076104 4785 scope.go:117] "RemoveContainer" containerID="26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.076181 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.105474 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.113642 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.120463 4785 scope.go:117] "RemoveContainer" containerID="1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.130527 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:59 crc kubenswrapper[4785]: E1126 15:26:59.133974 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-httpd" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.134002 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-httpd" Nov 26 15:26:59 crc kubenswrapper[4785]: E1126 15:26:59.134025 4785 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-log" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.134034 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-log" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.134207 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-httpd" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.134228 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" containerName="glance-log" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.135342 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.139125 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.139291 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.140463 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.140754 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.141230 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.141483 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-9hs9z" Nov 26 15:26:59 crc 
kubenswrapper[4785]: I1126 15:26:59.156029 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.176505 4785 scope.go:117] "RemoveContainer" containerID="26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b" Nov 26 15:26:59 crc kubenswrapper[4785]: E1126 15:26:59.177190 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b\": container with ID starting with 26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b not found: ID does not exist" containerID="26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.177290 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b"} err="failed to get container status \"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b\": rpc error: code = NotFound desc = could not find container \"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b\": container with ID starting with 26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b not found: ID does not exist" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.177378 4785 scope.go:117] "RemoveContainer" containerID="1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab" Nov 26 15:26:59 crc kubenswrapper[4785]: E1126 15:26:59.177904 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab\": container with ID starting with 1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab not found: ID does not exist" 
containerID="1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.178011 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab"} err="failed to get container status \"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab\": rpc error: code = NotFound desc = could not find container \"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab\": container with ID starting with 1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab not found: ID does not exist" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.178094 4785 scope.go:117] "RemoveContainer" containerID="26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.183231 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b"} err="failed to get container status \"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b\": rpc error: code = NotFound desc = could not find container \"26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b\": container with ID starting with 26a0d95733f198ac535b3c923ce811b83ab57e21b9dd6d177ec48e874a67f87b not found: ID does not exist" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.183305 4785 scope.go:117] "RemoveContainer" containerID="1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.183955 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab"} err="failed to get container status \"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab\": rpc error: code = NotFound desc = could 
not find container \"1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab\": container with ID starting with 1f7aa77dc504090c306f5446e8be1b76d9a92577cf287c57b1262ce52e2111ab not found: ID does not exist" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313292 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313369 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313402 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313430 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313460 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313489 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-httpd-run\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313645 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313775 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqzk\" (UniqueName: \"kubernetes.io/projected/232480ce-5e55-4c5c-bef5-67df7025b5eb-kube-api-access-mjqzk\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.313819 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-logs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.415616 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.415719 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-httpd-run\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.416337 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-httpd-run\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.416424 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.416960 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqzk\" (UniqueName: \"kubernetes.io/projected/232480ce-5e55-4c5c-bef5-67df7025b5eb-kube-api-access-mjqzk\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417005 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-logs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417112 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417184 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417230 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417267 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417370 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-logs\") pod \"glance-default-single-0\" (UID: 
\"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.417493 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.420580 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.421984 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.421989 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.422686 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.423703 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.441257 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqzk\" (UniqueName: \"kubernetes.io/projected/232480ce-5e55-4c5c-bef5-67df7025b5eb-kube-api-access-mjqzk\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.456704 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-0\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:26:59 crc kubenswrapper[4785]: I1126 15:26:59.756844 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:00 crc kubenswrapper[4785]: I1126 15:27:00.254701 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:27:00 crc kubenswrapper[4785]: W1126 15:27:00.263003 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod232480ce_5e55_4c5c_bef5_67df7025b5eb.slice/crio-82dae829810ad9654fbbee6fb4242a1cebc15f8387666f21ba30da05205d64e2 WatchSource:0}: Error finding container 82dae829810ad9654fbbee6fb4242a1cebc15f8387666f21ba30da05205d64e2: Status 404 returned error can't find the container with id 82dae829810ad9654fbbee6fb4242a1cebc15f8387666f21ba30da05205d64e2 Nov 26 15:27:01 crc kubenswrapper[4785]: I1126 15:27:01.056496 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e8eead-f5a5-4c43-a079-0a20eebc4b72" path="/var/lib/kubelet/pods/76e8eead-f5a5-4c43-a079-0a20eebc4b72/volumes" Nov 26 15:27:01 crc kubenswrapper[4785]: I1126 15:27:01.120629 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"232480ce-5e55-4c5c-bef5-67df7025b5eb","Type":"ContainerStarted","Data":"19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052"} Nov 26 15:27:01 crc kubenswrapper[4785]: I1126 15:27:01.120678 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"232480ce-5e55-4c5c-bef5-67df7025b5eb","Type":"ContainerStarted","Data":"82dae829810ad9654fbbee6fb4242a1cebc15f8387666f21ba30da05205d64e2"} Nov 26 15:27:02 crc kubenswrapper[4785]: I1126 15:27:02.144669 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"232480ce-5e55-4c5c-bef5-67df7025b5eb","Type":"ContainerStarted","Data":"0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806"} Nov 26 
15:27:02 crc kubenswrapper[4785]: I1126 15:27:02.185842 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.185819811 podStartE2EDuration="3.185819811s" podCreationTimestamp="2025-11-26 15:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:27:02.173408481 +0000 UTC m=+1185.851774315" watchObservedRunningTime="2025-11-26 15:27:02.185819811 +0000 UTC m=+1185.864185585" Nov 26 15:27:09 crc kubenswrapper[4785]: I1126 15:27:09.757121 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:09 crc kubenswrapper[4785]: I1126 15:27:09.757769 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:09 crc kubenswrapper[4785]: I1126 15:27:09.794160 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:09 crc kubenswrapper[4785]: I1126 15:27:09.804587 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:10 crc kubenswrapper[4785]: I1126 15:27:10.219414 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:10 crc kubenswrapper[4785]: I1126 15:27:10.219460 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:12 crc kubenswrapper[4785]: I1126 15:27:12.152232 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:12 crc kubenswrapper[4785]: I1126 15:27:12.153133 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.062985 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-xh4fd"] Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.071483 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-xh4fd"] Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.123241 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance4fd5-account-delete-2b2n4"] Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.124303 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.131767 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance4fd5-account-delete-2b2n4"] Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.186742 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.242269 4785 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="glance-kuttl-tests/glance-default-single-0" secret="" err="secret \"glance-glance-dockercfg-9hs9z\" not found" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.272973 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-operator-scripts\") pod \"glance4fd5-account-delete-2b2n4\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.273026 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42zr\" (UniqueName: \"kubernetes.io/projected/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-kube-api-access-m42zr\") pod \"glance4fd5-account-delete-2b2n4\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.373970 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-operator-scripts\") pod \"glance4fd5-account-delete-2b2n4\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.374008 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m42zr\" (UniqueName: \"kubernetes.io/projected/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-kube-api-access-m42zr\") pod \"glance4fd5-account-delete-2b2n4\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.374606 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret 
"glance-default-single-config-data" not found Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.374662 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:13.874645472 +0000 UTC m=+1197.553011236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-default-single-config-data" not found Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.374670 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.374704 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:13.874693584 +0000 UTC m=+1197.553059348 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-scripts" not found Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.374928 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-operator-scripts\") pod \"glance4fd5-account-delete-2b2n4\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.401810 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42zr\" (UniqueName: \"kubernetes.io/projected/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-kube-api-access-m42zr\") pod \"glance4fd5-account-delete-2b2n4\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.440530 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.882963 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.883064 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.883336 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. 
No retries permitted until 2025-11-26 15:27:14.883317472 +0000 UTC m=+1198.561683236 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-default-single-config-data" not found Nov 26 15:27:13 crc kubenswrapper[4785]: E1126 15:27:13.883389 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:14.883369094 +0000 UTC m=+1198.561734858 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-scripts" not found Nov 26 15:27:13 crc kubenswrapper[4785]: I1126 15:27:13.982765 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance4fd5-account-delete-2b2n4"] Nov 26 15:27:14 crc kubenswrapper[4785]: I1126 15:27:14.267437 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-log" containerID="cri-o://19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052" gracePeriod=30 Nov 26 15:27:14 crc kubenswrapper[4785]: I1126 15:27:14.267788 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" event={"ID":"5231dbe6-b3b3-467d-8b64-4af1843c6d9c","Type":"ContainerStarted","Data":"3dada919d61dce3122685dda7213cbfdbb204af8baad41f68c5704395899363b"} Nov 26 15:27:14 crc kubenswrapper[4785]: I1126 15:27:14.267816 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" event={"ID":"5231dbe6-b3b3-467d-8b64-4af1843c6d9c","Type":"ContainerStarted","Data":"752b4d0e597d99eebaeff81ca8379557d6506b2e498857cd49346d68713e0570"} Nov 26 15:27:14 crc kubenswrapper[4785]: I1126 15:27:14.268133 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-httpd" containerID="cri-o://0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806" gracePeriod=30 Nov 26 15:27:14 crc kubenswrapper[4785]: I1126 15:27:14.280188 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.111:9292/healthcheck\": EOF" Nov 26 15:27:14 crc kubenswrapper[4785]: E1126 15:27:14.905017 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 15:27:14 crc kubenswrapper[4785]: E1126 15:27:14.905045 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 15:27:14 crc kubenswrapper[4785]: E1126 15:27:14.905452 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:16.905405667 +0000 UTC m=+1200.583771431 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-scripts" not found Nov 26 15:27:14 crc kubenswrapper[4785]: E1126 15:27:14.905489 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:16.905467299 +0000 UTC m=+1200.583833083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-default-single-config-data" not found Nov 26 15:27:15 crc kubenswrapper[4785]: I1126 15:27:15.047469 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b72d2801-dc26-4fc0-aaf7-9def6813ae4c" path="/var/lib/kubelet/pods/b72d2801-dc26-4fc0-aaf7-9def6813ae4c/volumes" Nov 26 15:27:15 crc kubenswrapper[4785]: I1126 15:27:15.276305 4785 generic.go:334] "Generic (PLEG): container finished" podID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerID="19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052" exitCode=143 Nov 26 15:27:15 crc kubenswrapper[4785]: I1126 15:27:15.276371 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"232480ce-5e55-4c5c-bef5-67df7025b5eb","Type":"ContainerDied","Data":"19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052"} Nov 26 15:27:15 crc kubenswrapper[4785]: I1126 15:27:15.277932 4785 generic.go:334] "Generic (PLEG): container finished" podID="5231dbe6-b3b3-467d-8b64-4af1843c6d9c" containerID="3dada919d61dce3122685dda7213cbfdbb204af8baad41f68c5704395899363b" exitCode=0 Nov 26 15:27:15 crc 
kubenswrapper[4785]: I1126 15:27:15.277966 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" event={"ID":"5231dbe6-b3b3-467d-8b64-4af1843c6d9c","Type":"ContainerDied","Data":"3dada919d61dce3122685dda7213cbfdbb204af8baad41f68c5704395899363b"} Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.694725 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.834394 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m42zr\" (UniqueName: \"kubernetes.io/projected/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-kube-api-access-m42zr\") pod \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.834549 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-operator-scripts\") pod \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\" (UID: \"5231dbe6-b3b3-467d-8b64-4af1843c6d9c\") " Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.835320 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5231dbe6-b3b3-467d-8b64-4af1843c6d9c" (UID: "5231dbe6-b3b3-467d-8b64-4af1843c6d9c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.851786 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-kube-api-access-m42zr" (OuterVolumeSpecName: "kube-api-access-m42zr") pod "5231dbe6-b3b3-467d-8b64-4af1843c6d9c" (UID: "5231dbe6-b3b3-467d-8b64-4af1843c6d9c"). InnerVolumeSpecName "kube-api-access-m42zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.936075 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m42zr\" (UniqueName: \"kubernetes.io/projected/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-kube-api-access-m42zr\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:16 crc kubenswrapper[4785]: I1126 15:27:16.936104 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5231dbe6-b3b3-467d-8b64-4af1843c6d9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:16 crc kubenswrapper[4785]: E1126 15:27:16.936142 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-default-single-config-data: secret "glance-default-single-config-data" not found Nov 26 15:27:16 crc kubenswrapper[4785]: E1126 15:27:16.936156 4785 secret.go:188] Couldn't get secret glance-kuttl-tests/glance-scripts: secret "glance-scripts" not found Nov 26 15:27:16 crc kubenswrapper[4785]: E1126 15:27:16.936205 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:20.936189704 +0000 UTC m=+1204.614555468 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-scripts" not found Nov 26 15:27:16 crc kubenswrapper[4785]: E1126 15:27:16.936218 4785 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data podName:232480ce-5e55-4c5c-bef5-67df7025b5eb nodeName:}" failed. No retries permitted until 2025-11-26 15:27:20.936212834 +0000 UTC m=+1204.614578598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data") pod "glance-default-single-0" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb") : secret "glance-default-single-config-data" not found Nov 26 15:27:17 crc kubenswrapper[4785]: I1126 15:27:17.296873 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" event={"ID":"5231dbe6-b3b3-467d-8b64-4af1843c6d9c","Type":"ContainerDied","Data":"752b4d0e597d99eebaeff81ca8379557d6506b2e498857cd49346d68713e0570"} Nov 26 15:27:17 crc kubenswrapper[4785]: I1126 15:27:17.296929 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="752b4d0e597d99eebaeff81ca8379557d6506b2e498857cd49346d68713e0570" Nov 26 15:27:17 crc kubenswrapper[4785]: I1126 15:27:17.296991 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance4fd5-account-delete-2b2n4" Nov 26 15:27:18 crc kubenswrapper[4785]: I1126 15:27:18.158369 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-ftcc4"] Nov 26 15:27:18 crc kubenswrapper[4785]: I1126 15:27:18.164258 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-ftcc4"] Nov 26 15:27:18 crc kubenswrapper[4785]: I1126 15:27:18.175491 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h"] Nov 26 15:27:18 crc kubenswrapper[4785]: I1126 15:27:18.181917 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-4fd5-account-create-update-kwb7h"] Nov 26 15:27:18 crc kubenswrapper[4785]: I1126 15:27:18.187295 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance4fd5-account-delete-2b2n4"] Nov 26 15:27:18 crc kubenswrapper[4785]: I1126 15:27:18.191638 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance4fd5-account-delete-2b2n4"] Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.052619 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5231dbe6-b3b3-467d-8b64-4af1843c6d9c" path="/var/lib/kubelet/pods/5231dbe6-b3b3-467d-8b64-4af1843c6d9c/volumes" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.053904 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6be7fd-738f-4482-b123-b5bec25f89ed" path="/var/lib/kubelet/pods/bb6be7fd-738f-4482-b123-b5bec25f89ed/volumes" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.054344 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c73029-3f12-4d6e-90f0-8bac8b6acf15" path="/var/lib/kubelet/pods/e9c73029-3f12-4d6e-90f0-8bac8b6acf15/volumes" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.136969 4785 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301056 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301187 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-public-tls-certs\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301297 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-combined-ca-bundle\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301388 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-httpd-run\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301423 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-logs\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301489 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqzk\" (UniqueName: 
\"kubernetes.io/projected/232480ce-5e55-4c5c-bef5-67df7025b5eb-kube-api-access-mjqzk\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301533 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301595 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-internal-tls-certs\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.301630 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts\") pod \"232480ce-5e55-4c5c-bef5-67df7025b5eb\" (UID: \"232480ce-5e55-4c5c-bef5-67df7025b5eb\") " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.302085 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-logs" (OuterVolumeSpecName: "logs") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.302233 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.306651 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232480ce-5e55-4c5c-bef5-67df7025b5eb-kube-api-access-mjqzk" (OuterVolumeSpecName: "kube-api-access-mjqzk") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "kube-api-access-mjqzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.307850 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.308683 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts" (OuterVolumeSpecName: "scripts") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.313211 4785 generic.go:334] "Generic (PLEG): container finished" podID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerID="0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806" exitCode=0 Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.313249 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"232480ce-5e55-4c5c-bef5-67df7025b5eb","Type":"ContainerDied","Data":"0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806"} Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.313275 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"232480ce-5e55-4c5c-bef5-67df7025b5eb","Type":"ContainerDied","Data":"82dae829810ad9654fbbee6fb4242a1cebc15f8387666f21ba30da05205d64e2"} Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.313294 4785 scope.go:117] "RemoveContainer" containerID="0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.313410 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.323910 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.355976 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data" (OuterVolumeSpecName: "config-data") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.362743 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.366401 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "232480ce-5e55-4c5c-bef5-67df7025b5eb" (UID: "232480ce-5e55-4c5c-bef5-67df7025b5eb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.383386 4785 scope.go:117] "RemoveContainer" containerID="19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.399892 4785 scope.go:117] "RemoveContainer" containerID="0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806" Nov 26 15:27:19 crc kubenswrapper[4785]: E1126 15:27:19.400291 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806\": container with ID starting with 0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806 not found: ID does not exist" containerID="0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.400322 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806"} err="failed to get container status \"0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806\": rpc error: code = NotFound desc = could not find container \"0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806\": container with ID starting with 0ace70a88719950a3939b3acaf729edf2054bbdc4a26e70f8971e4f5c49d8806 not found: ID does not exist" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.400342 4785 scope.go:117] "RemoveContainer" containerID="19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052" Nov 26 15:27:19 crc kubenswrapper[4785]: E1126 15:27:19.400621 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052\": container with ID starting with 
19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052 not found: ID does not exist" containerID="19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.400671 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052"} err="failed to get container status \"19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052\": rpc error: code = NotFound desc = could not find container \"19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052\": container with ID starting with 19a3a1fd3581e5bda448b8df4beffb5e12251de6915adb7045978c7c7b31d052 not found: ID does not exist" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403623 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403649 4785 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403665 4785 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403677 4785 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403688 4785 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/232480ce-5e55-4c5c-bef5-67df7025b5eb-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403699 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqzk\" (UniqueName: \"kubernetes.io/projected/232480ce-5e55-4c5c-bef5-67df7025b5eb-kube-api-access-mjqzk\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403710 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403724 4785 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.403735 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/232480ce-5e55-4c5c-bef5-67df7025b5eb-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.420477 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.505159 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.643177 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:27:19 crc kubenswrapper[4785]: I1126 15:27:19.648076 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 26 15:27:20 crc 
kubenswrapper[4785]: I1126 15:27:20.962582 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-dr56t"] Nov 26 15:27:20 crc kubenswrapper[4785]: E1126 15:27:20.963253 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-httpd" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.963276 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-httpd" Nov 26 15:27:20 crc kubenswrapper[4785]: E1126 15:27:20.963496 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-log" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.963506 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-log" Nov 26 15:27:20 crc kubenswrapper[4785]: E1126 15:27:20.963534 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5231dbe6-b3b3-467d-8b64-4af1843c6d9c" containerName="mariadb-account-delete" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.963547 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5231dbe6-b3b3-467d-8b64-4af1843c6d9c" containerName="mariadb-account-delete" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.963818 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="5231dbe6-b3b3-467d-8b64-4af1843c6d9c" containerName="mariadb-account-delete" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.963844 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-httpd" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.963871 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" containerName="glance-log" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 
15:27:20.967384 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.971810 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-5542-account-create-update-kgwxd"] Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.972885 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.976798 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.979227 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-dr56t"] Nov 26 15:27:20 crc kubenswrapper[4785]: I1126 15:27:20.987997 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5542-account-create-update-kgwxd"] Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.043726 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232480ce-5e55-4c5c-bef5-67df7025b5eb" path="/var/lib/kubelet/pods/232480ce-5e55-4c5c-bef5-67df7025b5eb/volumes" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.127807 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5cd9a63-e7cf-4c00-b352-be258e30c83b-operator-scripts\") pod \"glance-5542-account-create-update-kgwxd\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.127874 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kjw\" (UniqueName: 
\"kubernetes.io/projected/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-kube-api-access-g5kjw\") pod \"glance-db-create-dr56t\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.127912 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-operator-scripts\") pod \"glance-db-create-dr56t\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.127937 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhrd\" (UniqueName: \"kubernetes.io/projected/b5cd9a63-e7cf-4c00-b352-be258e30c83b-kube-api-access-kbhrd\") pod \"glance-5542-account-create-update-kgwxd\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.229008 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5cd9a63-e7cf-4c00-b352-be258e30c83b-operator-scripts\") pod \"glance-5542-account-create-update-kgwxd\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.229085 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kjw\" (UniqueName: \"kubernetes.io/projected/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-kube-api-access-g5kjw\") pod \"glance-db-create-dr56t\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.229119 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-operator-scripts\") pod \"glance-db-create-dr56t\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.229144 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhrd\" (UniqueName: \"kubernetes.io/projected/b5cd9a63-e7cf-4c00-b352-be258e30c83b-kube-api-access-kbhrd\") pod \"glance-5542-account-create-update-kgwxd\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.230192 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5cd9a63-e7cf-4c00-b352-be258e30c83b-operator-scripts\") pod \"glance-5542-account-create-update-kgwxd\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.230233 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-operator-scripts\") pod \"glance-db-create-dr56t\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.249679 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhrd\" (UniqueName: \"kubernetes.io/projected/b5cd9a63-e7cf-4c00-b352-be258e30c83b-kube-api-access-kbhrd\") pod \"glance-5542-account-create-update-kgwxd\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: 
I1126 15:27:21.252909 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kjw\" (UniqueName: \"kubernetes.io/projected/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-kube-api-access-g5kjw\") pod \"glance-db-create-dr56t\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.303056 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.315027 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.748197 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-5542-account-create-update-kgwxd"] Nov 26 15:27:21 crc kubenswrapper[4785]: I1126 15:27:21.801839 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-dr56t"] Nov 26 15:27:21 crc kubenswrapper[4785]: W1126 15:27:21.811404 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b8d5b5_97ff_4bb8_9149_1af3d8aa824f.slice/crio-12d1914241566abcdb8dfcf2d10dd7876e1c3c745ffae13c29f40f7b66d5a924 WatchSource:0}: Error finding container 12d1914241566abcdb8dfcf2d10dd7876e1c3c745ffae13c29f40f7b66d5a924: Status 404 returned error can't find the container with id 12d1914241566abcdb8dfcf2d10dd7876e1c3c745ffae13c29f40f7b66d5a924 Nov 26 15:27:22 crc kubenswrapper[4785]: I1126 15:27:22.339741 4785 generic.go:334] "Generic (PLEG): container finished" podID="b5cd9a63-e7cf-4c00-b352-be258e30c83b" containerID="081d20860d7400cd61f7bde70e0d3481e66c32caf9cbbe4dabee7fa14820e4bb" exitCode=0 Nov 26 15:27:22 crc kubenswrapper[4785]: I1126 15:27:22.339813 4785 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" event={"ID":"b5cd9a63-e7cf-4c00-b352-be258e30c83b","Type":"ContainerDied","Data":"081d20860d7400cd61f7bde70e0d3481e66c32caf9cbbe4dabee7fa14820e4bb"} Nov 26 15:27:22 crc kubenswrapper[4785]: I1126 15:27:22.339840 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" event={"ID":"b5cd9a63-e7cf-4c00-b352-be258e30c83b","Type":"ContainerStarted","Data":"94e607eea40327363a010cf18445ccc004e35cb4242faaa6a0ee0c6264fcfa35"} Nov 26 15:27:22 crc kubenswrapper[4785]: I1126 15:27:22.342288 4785 generic.go:334] "Generic (PLEG): container finished" podID="37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" containerID="50f09461476963cbe51c87d941006a786c0461375c73c06f76c6b5dc5956c448" exitCode=0 Nov 26 15:27:22 crc kubenswrapper[4785]: I1126 15:27:22.342335 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dr56t" event={"ID":"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f","Type":"ContainerDied","Data":"50f09461476963cbe51c87d941006a786c0461375c73c06f76c6b5dc5956c448"} Nov 26 15:27:22 crc kubenswrapper[4785]: I1126 15:27:22.342361 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dr56t" event={"ID":"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f","Type":"ContainerStarted","Data":"12d1914241566abcdb8dfcf2d10dd7876e1c3c745ffae13c29f40f7b66d5a924"} Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.708911 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.715904 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.864002 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-operator-scripts\") pod \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.864073 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhrd\" (UniqueName: \"kubernetes.io/projected/b5cd9a63-e7cf-4c00-b352-be258e30c83b-kube-api-access-kbhrd\") pod \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.864107 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5cd9a63-e7cf-4c00-b352-be258e30c83b-operator-scripts\") pod \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\" (UID: \"b5cd9a63-e7cf-4c00-b352-be258e30c83b\") " Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.864132 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5kjw\" (UniqueName: \"kubernetes.io/projected/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-kube-api-access-g5kjw\") pod \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\" (UID: \"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f\") " Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.864653 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" (UID: "37b8d5b5-97ff-4bb8-9149-1af3d8aa824f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.864943 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd9a63-e7cf-4c00-b352-be258e30c83b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5cd9a63-e7cf-4c00-b352-be258e30c83b" (UID: "b5cd9a63-e7cf-4c00-b352-be258e30c83b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.868479 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-kube-api-access-g5kjw" (OuterVolumeSpecName: "kube-api-access-g5kjw") pod "37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" (UID: "37b8d5b5-97ff-4bb8-9149-1af3d8aa824f"). InnerVolumeSpecName "kube-api-access-g5kjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.868621 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cd9a63-e7cf-4c00-b352-be258e30c83b-kube-api-access-kbhrd" (OuterVolumeSpecName: "kube-api-access-kbhrd") pod "b5cd9a63-e7cf-4c00-b352-be258e30c83b" (UID: "b5cd9a63-e7cf-4c00-b352-be258e30c83b"). InnerVolumeSpecName "kube-api-access-kbhrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.966146 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.966544 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbhrd\" (UniqueName: \"kubernetes.io/projected/b5cd9a63-e7cf-4c00-b352-be258e30c83b-kube-api-access-kbhrd\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.966575 4785 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5cd9a63-e7cf-4c00-b352-be258e30c83b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:23 crc kubenswrapper[4785]: I1126 15:27:23.966583 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5kjw\" (UniqueName: \"kubernetes.io/projected/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f-kube-api-access-g5kjw\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:24 crc kubenswrapper[4785]: I1126 15:27:24.358799 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" event={"ID":"b5cd9a63-e7cf-4c00-b352-be258e30c83b","Type":"ContainerDied","Data":"94e607eea40327363a010cf18445ccc004e35cb4242faaa6a0ee0c6264fcfa35"} Nov 26 15:27:24 crc kubenswrapper[4785]: I1126 15:27:24.358839 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e607eea40327363a010cf18445ccc004e35cb4242faaa6a0ee0c6264fcfa35" Nov 26 15:27:24 crc kubenswrapper[4785]: I1126 15:27:24.358849 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-5542-account-create-update-kgwxd" Nov 26 15:27:24 crc kubenswrapper[4785]: I1126 15:27:24.360226 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-dr56t" event={"ID":"37b8d5b5-97ff-4bb8-9149-1af3d8aa824f","Type":"ContainerDied","Data":"12d1914241566abcdb8dfcf2d10dd7876e1c3c745ffae13c29f40f7b66d5a924"} Nov 26 15:27:24 crc kubenswrapper[4785]: I1126 15:27:24.360244 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12d1914241566abcdb8dfcf2d10dd7876e1c3c745ffae13c29f40f7b66d5a924" Nov 26 15:27:24 crc kubenswrapper[4785]: I1126 15:27:24.360266 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-dr56t" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.199147 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-5bhl4"] Nov 26 15:27:26 crc kubenswrapper[4785]: E1126 15:27:26.200592 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" containerName="mariadb-database-create" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.200679 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" containerName="mariadb-database-create" Nov 26 15:27:26 crc kubenswrapper[4785]: E1126 15:27:26.200799 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cd9a63-e7cf-4c00-b352-be258e30c83b" containerName="mariadb-account-create-update" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.200869 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cd9a63-e7cf-4c00-b352-be258e30c83b" containerName="mariadb-account-create-update" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.201067 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" 
containerName="mariadb-database-create" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.201150 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5cd9a63-e7cf-4c00-b352-be258e30c83b" containerName="mariadb-account-create-update" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.201794 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.204145 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p94cf" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.211405 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.216875 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5bhl4"] Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.297236 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99l92\" (UniqueName: \"kubernetes.io/projected/e81fe919-3471-4913-a892-b03f703d3ed9-kube-api-access-99l92\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.297469 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-db-sync-config-data\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.297583 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-config-data\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.398506 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99l92\" (UniqueName: \"kubernetes.io/projected/e81fe919-3471-4913-a892-b03f703d3ed9-kube-api-access-99l92\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.398786 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-db-sync-config-data\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.398918 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-config-data\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.403152 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-config-data\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.413024 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-db-sync-config-data\") pod 
\"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.417299 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99l92\" (UniqueName: \"kubernetes.io/projected/e81fe919-3471-4913-a892-b03f703d3ed9-kube-api-access-99l92\") pod \"glance-db-sync-5bhl4\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.532823 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:26 crc kubenswrapper[4785]: I1126 15:27:26.946106 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5bhl4"] Nov 26 15:27:27 crc kubenswrapper[4785]: I1126 15:27:27.382863 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5bhl4" event={"ID":"e81fe919-3471-4913-a892-b03f703d3ed9","Type":"ContainerStarted","Data":"481ea91af0a3a907aa31c8beb7bfbbc6e8d5bf0d5eaadc2a874212a3f895e068"} Nov 26 15:27:28 crc kubenswrapper[4785]: I1126 15:27:28.391028 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5bhl4" event={"ID":"e81fe919-3471-4913-a892-b03f703d3ed9","Type":"ContainerStarted","Data":"4b0ef06591a646674dc835100c6d1a286b80e10cc357cffd7c1e1b3636a5c35b"} Nov 26 15:27:30 crc kubenswrapper[4785]: I1126 15:27:30.411780 4785 generic.go:334] "Generic (PLEG): container finished" podID="e81fe919-3471-4913-a892-b03f703d3ed9" containerID="4b0ef06591a646674dc835100c6d1a286b80e10cc357cffd7c1e1b3636a5c35b" exitCode=0 Nov 26 15:27:30 crc kubenswrapper[4785]: I1126 15:27:30.411923 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5bhl4" 
event={"ID":"e81fe919-3471-4913-a892-b03f703d3ed9","Type":"ContainerDied","Data":"4b0ef06591a646674dc835100c6d1a286b80e10cc357cffd7c1e1b3636a5c35b"} Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.700084 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.769578 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99l92\" (UniqueName: \"kubernetes.io/projected/e81fe919-3471-4913-a892-b03f703d3ed9-kube-api-access-99l92\") pod \"e81fe919-3471-4913-a892-b03f703d3ed9\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.769673 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-config-data\") pod \"e81fe919-3471-4913-a892-b03f703d3ed9\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.769724 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-db-sync-config-data\") pod \"e81fe919-3471-4913-a892-b03f703d3ed9\" (UID: \"e81fe919-3471-4913-a892-b03f703d3ed9\") " Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.784585 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81fe919-3471-4913-a892-b03f703d3ed9-kube-api-access-99l92" (OuterVolumeSpecName: "kube-api-access-99l92") pod "e81fe919-3471-4913-a892-b03f703d3ed9" (UID: "e81fe919-3471-4913-a892-b03f703d3ed9"). InnerVolumeSpecName "kube-api-access-99l92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.806708 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e81fe919-3471-4913-a892-b03f703d3ed9" (UID: "e81fe919-3471-4913-a892-b03f703d3ed9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.840724 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-config-data" (OuterVolumeSpecName: "config-data") pod "e81fe919-3471-4913-a892-b03f703d3ed9" (UID: "e81fe919-3471-4913-a892-b03f703d3ed9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.871387 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99l92\" (UniqueName: \"kubernetes.io/projected/e81fe919-3471-4913-a892-b03f703d3ed9-kube-api-access-99l92\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.871418 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:31 crc kubenswrapper[4785]: I1126 15:27:31.871429 4785 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e81fe919-3471-4913-a892-b03f703d3ed9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:32 crc kubenswrapper[4785]: I1126 15:27:32.434384 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5bhl4" 
event={"ID":"e81fe919-3471-4913-a892-b03f703d3ed9","Type":"ContainerDied","Data":"481ea91af0a3a907aa31c8beb7bfbbc6e8d5bf0d5eaadc2a874212a3f895e068"} Nov 26 15:27:32 crc kubenswrapper[4785]: I1126 15:27:32.434432 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="481ea91af0a3a907aa31c8beb7bfbbc6e8d5bf0d5eaadc2a874212a3f895e068" Nov 26 15:27:32 crc kubenswrapper[4785]: I1126 15:27:32.434982 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5bhl4" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.787921 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 15:27:33 crc kubenswrapper[4785]: E1126 15:27:33.788376 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81fe919-3471-4913-a892-b03f703d3ed9" containerName="glance-db-sync" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.788388 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81fe919-3471-4913-a892-b03f703d3ed9" containerName="glance-db-sync" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.788762 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81fe919-3471-4913-a892-b03f703d3ed9" containerName="glance-db-sync" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.790532 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.794738 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.794969 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.795199 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p94cf" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.806751 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.903378 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-sys\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904071 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59fd54c3-fef6-4165-ab5a-3bb74543da8b-scripts\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904146 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" 
Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904204 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-dev\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904261 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904377 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngm9j\" (UniqueName: \"kubernetes.io/projected/59fd54c3-fef6-4165-ab5a-3bb74543da8b-kube-api-access-ngm9j\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904422 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904488 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59fd54c3-fef6-4165-ab5a-3bb74543da8b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904574 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59fd54c3-fef6-4165-ab5a-3bb74543da8b-config-data\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904660 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59fd54c3-fef6-4165-ab5a-3bb74543da8b-logs\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904688 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904736 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-run\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904774 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: 
\"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.904804 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.933145 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.934948 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.944486 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 15:27:33 crc kubenswrapper[4785]: I1126 15:27:33.952039 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006606 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006672 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006704 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006736 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-sys\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006772 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006801 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-dev\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006825 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-run\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006861 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006892 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59fd54c3-fef6-4165-ab5a-3bb74543da8b-scripts\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006914 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006941 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006965 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: 
\"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.006989 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007008 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007043 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-dev\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007074 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007098 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") 
" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007124 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007141 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngm9j\" (UniqueName: \"kubernetes.io/projected/59fd54c3-fef6-4165-ab5a-3bb74543da8b-kube-api-access-ngm9j\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007157 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007175 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59fd54c3-fef6-4165-ab5a-3bb74543da8b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007196 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59fd54c3-fef6-4165-ab5a-3bb74543da8b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007210 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-sys\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007231 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bqx2\" (UniqueName: \"kubernetes.io/projected/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-kube-api-access-2bqx2\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007247 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59fd54c3-fef6-4165-ab5a-3bb74543da8b-logs\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007262 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007281 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007305 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-run\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007368 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-run\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007405 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007445 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007479 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-sys\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 
15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007801 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.007889 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-dev\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.008137 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59fd54c3-fef6-4165-ab5a-3bb74543da8b-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.008184 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.008612 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59fd54c3-fef6-4165-ab5a-3bb74543da8b-logs\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.008611 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59fd54c3-fef6-4165-ab5a-3bb74543da8b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.008723 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.012435 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59fd54c3-fef6-4165-ab5a-3bb74543da8b-config-data\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.022688 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59fd54c3-fef6-4165-ab5a-3bb74543da8b-scripts\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.023295 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngm9j\" (UniqueName: \"kubernetes.io/projected/59fd54c3-fef6-4165-ab5a-3bb74543da8b-kube-api-access-ngm9j\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.028751 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.044167 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"59fd54c3-fef6-4165-ab5a-3bb74543da8b\") " pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.108974 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109033 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-dev\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109070 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-run\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109116 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109145 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109179 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109201 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109224 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109279 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109312 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109356 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-sys\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109382 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bqx2\" (UniqueName: \"kubernetes.io/projected/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-kube-api-access-2bqx2\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109404 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109445 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109720 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-run\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109752 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-dev\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.109179 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110029 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110111 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110262 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110313 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110368 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110664 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-sys\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110903 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.110993 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.111273 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.118026 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.118487 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.136703 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: 
I1126 15:27:34.138037 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bqx2\" (UniqueName: \"kubernetes.io/projected/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-kube-api-access-2bqx2\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.141966 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.256335 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.496716 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:34 crc kubenswrapper[4785]: W1126 15:27:34.502451 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfabcb040_8ff8_4302_8628_7c3d52c1a5cb.slice/crio-f662982f5b1d7de5b487ea2b1ee914a05bf354a7a2e81899d4108b6b11b27c50 WatchSource:0}: Error finding container f662982f5b1d7de5b487ea2b1ee914a05bf354a7a2e81899d4108b6b11b27c50: Status 404 returned error can't find the container with id f662982f5b1d7de5b487ea2b1ee914a05bf354a7a2e81899d4108b6b11b27c50 Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.556735 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 26 15:27:34 crc kubenswrapper[4785]: W1126 15:27:34.570262 4785 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59fd54c3_fef6_4165_ab5a_3bb74543da8b.slice/crio-4e8218b8d7cdbe8b2bbe8d4e2e726b25b88ee896e56a7eb837d8b17d0a4c22b4 WatchSource:0}: Error finding container 4e8218b8d7cdbe8b2bbe8d4e2e726b25b88ee896e56a7eb837d8b17d0a4c22b4: Status 404 returned error can't find the container with id 4e8218b8d7cdbe8b2bbe8d4e2e726b25b88ee896e56a7eb837d8b17d0a4c22b4 Nov 26 15:27:34 crc kubenswrapper[4785]: I1126 15:27:34.825927 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.460507 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"59fd54c3-fef6-4165-ab5a-3bb74543da8b","Type":"ContainerStarted","Data":"0d066f4fc73b34551aee79e5681e88c1e025ee35655080bff726d0335cda2a50"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.461011 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"59fd54c3-fef6-4165-ab5a-3bb74543da8b","Type":"ContainerStarted","Data":"ce3b83a301ee7beb4c13b3e636a446990d3f4e20b59efe5daaec44f09d46cfd9"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.461023 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"59fd54c3-fef6-4165-ab5a-3bb74543da8b","Type":"ContainerStarted","Data":"2eb2916c0dcbeb771b79a6d07b12136e1d36ea5ecfe99773270e9ce5b28d1153"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.461034 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"59fd54c3-fef6-4165-ab5a-3bb74543da8b","Type":"ContainerStarted","Data":"4e8218b8d7cdbe8b2bbe8d4e2e726b25b88ee896e56a7eb837d8b17d0a4c22b4"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.464510 4785 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerStarted","Data":"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.464545 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerStarted","Data":"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.464578 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerStarted","Data":"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.464590 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerStarted","Data":"f662982f5b1d7de5b487ea2b1ee914a05bf354a7a2e81899d4108b6b11b27c50"} Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.464696 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-log" containerID="cri-o://cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" gracePeriod=30 Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.464945 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-api" containerID="cri-o://93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" gracePeriod=30 Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.465004 4785 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-httpd" containerID="cri-o://8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" gracePeriod=30 Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.492324 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.492305231 podStartE2EDuration="2.492305231s" podCreationTimestamp="2025-11-26 15:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:27:35.488383447 +0000 UTC m=+1219.166749221" watchObservedRunningTime="2025-11-26 15:27:35.492305231 +0000 UTC m=+1219.170670995" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.525859 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.525836021 podStartE2EDuration="3.525836021s" podCreationTimestamp="2025-11-26 15:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:27:35.515787564 +0000 UTC m=+1219.194153348" watchObservedRunningTime="2025-11-26 15:27:35.525836021 +0000 UTC m=+1219.204201785" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.855804 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938501 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-config-data\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938564 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938632 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-lib-modules\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938672 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-run\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938715 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-nvme\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938751 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bqx2\" (UniqueName: 
\"kubernetes.io/projected/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-kube-api-access-2bqx2\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938783 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-dev\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938813 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-sys\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938846 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-scripts\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938882 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-var-locks-brick\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938908 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-iscsi\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938942 4785 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-logs\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.938979 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-httpd-run\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939009 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\" (UID: \"fabcb040-8ff8-4302-8628-7c3d52c1a5cb\") " Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939062 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939115 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-dev" (OuterVolumeSpecName: "dev") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939200 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-sys" (OuterVolumeSpecName: "sys") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939262 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939300 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939306 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-run" (OuterVolumeSpecName: "run") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939339 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939602 4785 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-sys\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939625 4785 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939636 4785 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939645 4785 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939653 4785 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939661 4785 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-etc-nvme\") on node 
\"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939669 4785 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-dev\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939602 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-logs" (OuterVolumeSpecName: "logs") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.939736 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.943657 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-scripts" (OuterVolumeSpecName: "scripts") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.944770 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-kube-api-access-2bqx2" (OuterVolumeSpecName: "kube-api-access-2bqx2") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "kube-api-access-2bqx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.944770 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:27:35 crc kubenswrapper[4785]: I1126 15:27:35.944796 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.023752 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-config-data" (OuterVolumeSpecName: "config-data") pod "fabcb040-8ff8-4302-8628-7c3d52c1a5cb" (UID: "fabcb040-8ff8-4302-8628-7c3d52c1a5cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042222 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bqx2\" (UniqueName: \"kubernetes.io/projected/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-kube-api-access-2bqx2\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042336 4785 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-scripts\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042357 4785 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-logs\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042369 4785 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042411 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042425 4785 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fabcb040-8ff8-4302-8628-7c3d52c1a5cb-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.042443 4785 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.056415 4785 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.060067 4785 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.143906 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.143937 4785 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.485871 4785 generic.go:334] "Generic (PLEG): container finished" podID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerID="93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" exitCode=143 Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.485907 4785 generic.go:334] "Generic (PLEG): container finished" podID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerID="8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" exitCode=143 Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.485917 4785 generic.go:334] "Generic (PLEG): container finished" podID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerID="cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" exitCode=143 Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.485974 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.485990 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerDied","Data":"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27"} Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.486042 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerDied","Data":"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24"} Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.486055 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerDied","Data":"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836"} Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.486069 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fabcb040-8ff8-4302-8628-7c3d52c1a5cb","Type":"ContainerDied","Data":"f662982f5b1d7de5b487ea2b1ee914a05bf354a7a2e81899d4108b6b11b27c50"} Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.486086 4785 scope.go:117] "RemoveContainer" containerID="93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.514064 4785 scope.go:117] "RemoveContainer" containerID="8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.537489 4785 scope.go:117] "RemoveContainer" containerID="cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.542543 4785 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.548713 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.570256 4785 scope.go:117] "RemoveContainer" containerID="93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.570835 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:36 crc kubenswrapper[4785]: E1126 15:27:36.570842 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": container with ID starting with 93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27 not found: ID does not exist" containerID="93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571100 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27"} err="failed to get container status \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": rpc error: code = NotFound desc = could not find container \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": container with ID starting with 93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571175 4785 scope.go:117] "RemoveContainer" containerID="8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" Nov 26 15:27:36 crc kubenswrapper[4785]: E1126 15:27:36.571245 4785 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-api" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571284 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-api" Nov 26 15:27:36 crc kubenswrapper[4785]: E1126 15:27:36.571314 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-httpd" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571326 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-httpd" Nov 26 15:27:36 crc kubenswrapper[4785]: E1126 15:27:36.571355 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-log" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571368 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-log" Nov 26 15:27:36 crc kubenswrapper[4785]: E1126 15:27:36.571542 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": container with ID starting with 8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24 not found: ID does not exist" containerID="8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571634 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24"} err="failed to get container status \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": rpc error: code = NotFound desc = could not find container \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": container with ID starting 
with 8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571738 4785 scope.go:117] "RemoveContainer" containerID="cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.571608 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-api" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.572057 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-log" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.572150 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" containerName="glance-httpd" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.573291 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: E1126 15:27:36.572086 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": container with ID starting with cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836 not found: ID does not exist" containerID="cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.573825 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836"} err="failed to get container status \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": rpc error: code = NotFound desc = could not find container \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": 
container with ID starting with cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.573892 4785 scope.go:117] "RemoveContainer" containerID="93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.574285 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27"} err="failed to get container status \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": rpc error: code = NotFound desc = could not find container \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": container with ID starting with 93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.574337 4785 scope.go:117] "RemoveContainer" containerID="8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.575755 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24"} err="failed to get container status \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": rpc error: code = NotFound desc = could not find container \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": container with ID starting with 8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.575854 4785 scope.go:117] "RemoveContainer" containerID="cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.576069 4785 reflector.go:368] Caches populated for *v1.Secret from 
object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.576372 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836"} err="failed to get container status \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": rpc error: code = NotFound desc = could not find container \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": container with ID starting with cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.576398 4785 scope.go:117] "RemoveContainer" containerID="93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.576685 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27"} err="failed to get container status \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": rpc error: code = NotFound desc = could not find container \"93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27\": container with ID starting with 93d68345702e12ebe5aefba07254e1c8ba324a8f30175bbfd01c1c72d07b1a27 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.576715 4785 scope.go:117] "RemoveContainer" containerID="8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.577079 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24"} err="failed to get container status \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": rpc error: code = NotFound desc = could not find 
container \"8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24\": container with ID starting with 8f713110f256777f141b637f15571db17aef871e32e978a230e08b87cbedad24 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.577166 4785 scope.go:117] "RemoveContainer" containerID="cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.577441 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836"} err="failed to get container status \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": rpc error: code = NotFound desc = could not find container \"cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836\": container with ID starting with cadcae2ee6f9ea73047f137d1ab3f9ef10fed9e7a338d2dafd3ca1adbfe1e836 not found: ID does not exist" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.597930 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.650910 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.650990 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: 
I1126 15:27:36.651022 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651175 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-sys\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651202 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7g6j\" (UniqueName: \"kubernetes.io/projected/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-kube-api-access-r7g6j\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651309 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-dev\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651408 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-run\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 
15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651509 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651630 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651697 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651729 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.651800 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.652079 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.652114 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753749 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753810 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753835 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753884 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753910 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753933 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753973 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-sys\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.753995 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7g6j\" (UniqueName: \"kubernetes.io/projected/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-kube-api-access-r7g6j\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" 
Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754017 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-dev\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754043 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-run\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754076 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754118 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754144 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754165 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754279 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.754349 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.755165 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-sys\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.755414 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.755664 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-dev\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.755881 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-run\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.756730 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.757073 4785 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.757184 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.757426 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.757886 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.759436 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.762028 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.774651 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7g6j\" (UniqueName: \"kubernetes.io/projected/82000356-60a7-4c1b-8f86-c7ecc1a25d7f-kube-api-access-r7g6j\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.777356 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.792350 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"82000356-60a7-4c1b-8f86-c7ecc1a25d7f\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:36 crc kubenswrapper[4785]: I1126 15:27:36.890473 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:37 crc kubenswrapper[4785]: I1126 15:27:37.052655 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabcb040-8ff8-4302-8628-7c3d52c1a5cb" path="/var/lib/kubelet/pods/fabcb040-8ff8-4302-8628-7c3d52c1a5cb/volumes" Nov 26 15:27:37 crc kubenswrapper[4785]: I1126 15:27:37.393963 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 26 15:27:37 crc kubenswrapper[4785]: W1126 15:27:37.396315 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82000356_60a7_4c1b_8f86_c7ecc1a25d7f.slice/crio-abeb7bf4fc4c0bdf2c91f37e66cc41fa312d8702952fd3699afd0736c108f75a WatchSource:0}: Error finding container abeb7bf4fc4c0bdf2c91f37e66cc41fa312d8702952fd3699afd0736c108f75a: Status 404 returned error can't find the container with id abeb7bf4fc4c0bdf2c91f37e66cc41fa312d8702952fd3699afd0736c108f75a Nov 26 15:27:37 crc kubenswrapper[4785]: I1126 15:27:37.493686 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"82000356-60a7-4c1b-8f86-c7ecc1a25d7f","Type":"ContainerStarted","Data":"abeb7bf4fc4c0bdf2c91f37e66cc41fa312d8702952fd3699afd0736c108f75a"} Nov 26 
15:27:38 crc kubenswrapper[4785]: I1126 15:27:38.509366 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"82000356-60a7-4c1b-8f86-c7ecc1a25d7f","Type":"ContainerStarted","Data":"cefc04f99b4503f008bc49ceb69407a74f6deaa1b363fbfc5ab84603ef62d131"} Nov 26 15:27:38 crc kubenswrapper[4785]: I1126 15:27:38.510141 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"82000356-60a7-4c1b-8f86-c7ecc1a25d7f","Type":"ContainerStarted","Data":"0325461aed5a7d722bc1cbb0a468dd15e87590d4795bf3dd772935bb7fd1700b"} Nov 26 15:27:38 crc kubenswrapper[4785]: I1126 15:27:38.510196 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"82000356-60a7-4c1b-8f86-c7ecc1a25d7f","Type":"ContainerStarted","Data":"4ced12214c50a28587aa32b07492bcd3bea3518a3f0d26f7fffcc40279fcf8b2"} Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.111788 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.112284 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.112294 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.146922 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.152712 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.153045 
4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.189240 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=8.189223089 podStartE2EDuration="8.189223089s" podCreationTimestamp="2025-11-26 15:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:27:38.561642859 +0000 UTC m=+1222.240008703" watchObservedRunningTime="2025-11-26 15:27:44.189223089 +0000 UTC m=+1227.867588853" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.564008 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.564348 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.564363 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.577010 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.577089 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:44 crc kubenswrapper[4785]: I1126 15:27:44.579416 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 26 15:27:46 crc kubenswrapper[4785]: I1126 15:27:46.901351 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:46 crc kubenswrapper[4785]: I1126 15:27:46.902167 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:46 crc kubenswrapper[4785]: I1126 15:27:46.902251 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:46 crc kubenswrapper[4785]: I1126 15:27:46.932605 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:46 crc kubenswrapper[4785]: I1126 15:27:46.947000 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:46 crc kubenswrapper[4785]: I1126 15:27:46.947930 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:47 crc kubenswrapper[4785]: I1126 15:27:47.589965 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:47 crc kubenswrapper[4785]: I1126 15:27:47.590008 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:47 crc kubenswrapper[4785]: I1126 15:27:47.590020 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:47 crc kubenswrapper[4785]: I1126 15:27:47.602029 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:47 crc kubenswrapper[4785]: I1126 15:27:47.602791 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:27:47 crc 
kubenswrapper[4785]: I1126 15:27:47.603005 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 26 15:28:37 crc kubenswrapper[4785]: I1126 15:28:37.289013 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:28:37 crc kubenswrapper[4785]: I1126 15:28:37.289906 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:29:07 crc kubenswrapper[4785]: I1126 15:29:07.289352 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:29:07 crc kubenswrapper[4785]: I1126 15:29:07.290069 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:29:17 crc kubenswrapper[4785]: I1126 15:29:17.680681 4785 scope.go:117] "RemoveContainer" containerID="30a99bd1d04da190c9ca9bbce3f155e1e1e59b8f6ee5df199b9eb29d72cc0b96" Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.289141 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.289998 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.290070 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.291129 4785 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"516aa1c954a9db24fdce21dd42520586d80f2cd7166b0bbaba72d20a1a2dfd33"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.291240 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://516aa1c954a9db24fdce21dd42520586d80f2cd7166b0bbaba72d20a1a2dfd33" gracePeriod=600 Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.594030 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerID="516aa1c954a9db24fdce21dd42520586d80f2cd7166b0bbaba72d20a1a2dfd33" exitCode=0 Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.594083 4785 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"516aa1c954a9db24fdce21dd42520586d80f2cd7166b0bbaba72d20a1a2dfd33"} Nov 26 15:29:37 crc kubenswrapper[4785]: I1126 15:29:37.594119 4785 scope.go:117] "RemoveContainer" containerID="931a84441789a47b1ac55236dde8c88c217189d0a2b8d7b82c6917d783312096" Nov 26 15:29:38 crc kubenswrapper[4785]: I1126 15:29:38.604534 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a"} Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.157263 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.159296 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.177997 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.181467 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.194248 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.201148 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.207758 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.208839 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.211364 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.211629 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.214944 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.252892 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-secret-volume\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc 
kubenswrapper[4785]: I1126 15:30:00.253084 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-config-volume\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.253125 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrpm4\" (UniqueName: \"kubernetes.io/projected/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-kube-api-access-jrpm4\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.354838 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355000 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355127 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-config-volume\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355195 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njlsx\" (UniqueName: \"kubernetes.io/projected/89d71759-41c7-4c92-b800-7305c551e991-kube-api-access-njlsx\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355261 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrpm4\" (UniqueName: \"kubernetes.io/projected/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-kube-api-access-jrpm4\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355325 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l8vw\" (UniqueName: \"kubernetes.io/projected/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-kube-api-access-2l8vw\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355470 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-secret-volume\") pod \"collect-profiles-29402850-6rtsw\" (UID: 
\"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355538 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/89d71759-41c7-4c92-b800-7305c551e991-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355639 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.355990 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-config-volume\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.362189 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-secret-volume\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.371596 4785 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jrpm4\" (UniqueName: \"kubernetes.io/projected/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-kube-api-access-jrpm4\") pod \"collect-profiles-29402850-6rtsw\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.376507 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.384785 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.456683 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njlsx\" (UniqueName: \"kubernetes.io/projected/89d71759-41c7-4c92-b800-7305c551e991-kube-api-access-njlsx\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.456736 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l8vw\" (UniqueName: \"kubernetes.io/projected/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-kube-api-access-2l8vw\") pod 
\"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.456769 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/89d71759-41c7-4c92-b800-7305c551e991-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.456835 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.460680 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/89d71759-41c7-4c92-b800-7305c551e991-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.465951 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: 
\"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.471982 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njlsx\" (UniqueName: \"kubernetes.io/projected/89d71759-41c7-4c92-b800-7305c551e991-kube-api-access-njlsx\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.473154 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l8vw\" (UniqueName: \"kubernetes.io/projected/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-kube-api-access-2l8vw\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.497179 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.519776 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.539818 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.917972 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b"] Nov 26 15:30:00 crc kubenswrapper[4785]: I1126 15:30:00.968290 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm"] Nov 26 15:30:00 crc kubenswrapper[4785]: W1126 15:30:00.973064 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89d71759_41c7_4c92_b800_7305c551e991.slice/crio-5cac679cf24988dc05a118e81412c845b977f67f7b5bf8500eb5d0e01e4a3213 WatchSource:0}: Error finding container 5cac679cf24988dc05a118e81412c845b977f67f7b5bf8500eb5d0e01e4a3213: Status 404 returned error can't find the container with id 5cac679cf24988dc05a118e81412c845b977f67f7b5bf8500eb5d0e01e4a3213 Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.015899 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw"] Nov 26 15:30:01 crc kubenswrapper[4785]: W1126 15:30:01.024623 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f034bb_db60_4d3a_877e_fcbb3aa9eb71.slice/crio-23f2e18e15ab7171c84d6317d8dab0187a9152c6c675377edc25f6d9ac366e58 WatchSource:0}: Error finding container 23f2e18e15ab7171c84d6317d8dab0187a9152c6c675377edc25f6d9ac366e58: Status 404 returned error can't find the container with id 23f2e18e15ab7171c84d6317d8dab0187a9152c6c675377edc25f6d9ac366e58 Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.851765 4785 generic.go:334] "Generic (PLEG): container finished" podID="74f034bb-db60-4d3a-877e-fcbb3aa9eb71" 
containerID="4b3d0a99992838acae8df23eb9ca684df049e23014d5c39ba7ee723cc27dae93" exitCode=0 Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.852291 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" event={"ID":"74f034bb-db60-4d3a-877e-fcbb3aa9eb71","Type":"ContainerDied","Data":"4b3d0a99992838acae8df23eb9ca684df049e23014d5c39ba7ee723cc27dae93"} Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.852317 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" event={"ID":"74f034bb-db60-4d3a-877e-fcbb3aa9eb71","Type":"ContainerStarted","Data":"23f2e18e15ab7171c84d6317d8dab0187a9152c6c675377edc25f6d9ac366e58"} Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.863816 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" event={"ID":"89d71759-41c7-4c92-b800-7305c551e991","Type":"ContainerStarted","Data":"5f06c70635cde368253d2d8bc3afcc0fcdb0888eab74dc884a4d637be88e1e33"} Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.863892 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" event={"ID":"89d71759-41c7-4c92-b800-7305c551e991","Type":"ContainerStarted","Data":"5cac679cf24988dc05a118e81412c845b977f67f7b5bf8500eb5d0e01e4a3213"} Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.876568 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" event={"ID":"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7","Type":"ContainerStarted","Data":"027102539d92179f9c2368332f0420228db08b886427dd12691743e8f6d0594f"} Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.876625 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" event={"ID":"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7","Type":"ContainerStarted","Data":"1b703f29f9c7b7b7e529542c9e24d20291dba867e73e53a80b1b766264fa91a5"} Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.887978 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" podStartSLOduration=1.887935779 podStartE2EDuration="1.887935779s" podCreationTimestamp="2025-11-26 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:30:01.883183593 +0000 UTC m=+1365.561549377" watchObservedRunningTime="2025-11-26 15:30:01.887935779 +0000 UTC m=+1365.566301543" Nov 26 15:30:01 crc kubenswrapper[4785]: I1126 15:30:01.901895 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" podStartSLOduration=1.901874999 podStartE2EDuration="1.901874999s" podCreationTimestamp="2025-11-26 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:30:01.899374332 +0000 UTC m=+1365.577740116" watchObservedRunningTime="2025-11-26 15:30:01.901874999 +0000 UTC m=+1365.580240763" Nov 26 15:30:02 crc kubenswrapper[4785]: I1126 15:30:02.885248 4785 generic.go:334] "Generic (PLEG): container finished" podID="dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" containerID="027102539d92179f9c2368332f0420228db08b886427dd12691743e8f6d0594f" exitCode=0 Nov 26 15:30:02 crc kubenswrapper[4785]: I1126 15:30:02.885365 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" 
event={"ID":"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7","Type":"ContainerDied","Data":"027102539d92179f9c2368332f0420228db08b886427dd12691743e8f6d0594f"} Nov 26 15:30:02 crc kubenswrapper[4785]: I1126 15:30:02.888604 4785 generic.go:334] "Generic (PLEG): container finished" podID="89d71759-41c7-4c92-b800-7305c551e991" containerID="5f06c70635cde368253d2d8bc3afcc0fcdb0888eab74dc884a4d637be88e1e33" exitCode=0 Nov 26 15:30:02 crc kubenswrapper[4785]: I1126 15:30:02.888950 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" event={"ID":"89d71759-41c7-4c92-b800-7305c551e991","Type":"ContainerDied","Data":"5f06c70635cde368253d2d8bc3afcc0fcdb0888eab74dc884a4d637be88e1e33"} Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.146900 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.300806 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-config-volume\") pod \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.300898 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrpm4\" (UniqueName: \"kubernetes.io/projected/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-kube-api-access-jrpm4\") pod \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\" (UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.300923 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-secret-volume\") pod \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\" 
(UID: \"74f034bb-db60-4d3a-877e-fcbb3aa9eb71\") " Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.301970 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-config-volume" (OuterVolumeSpecName: "config-volume") pod "74f034bb-db60-4d3a-877e-fcbb3aa9eb71" (UID: "74f034bb-db60-4d3a-877e-fcbb3aa9eb71"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.306079 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74f034bb-db60-4d3a-877e-fcbb3aa9eb71" (UID: "74f034bb-db60-4d3a-877e-fcbb3aa9eb71"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.307013 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-kube-api-access-jrpm4" (OuterVolumeSpecName: "kube-api-access-jrpm4") pod "74f034bb-db60-4d3a-877e-fcbb3aa9eb71" (UID: "74f034bb-db60-4d3a-877e-fcbb3aa9eb71"). InnerVolumeSpecName "kube-api-access-jrpm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.402336 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrpm4\" (UniqueName: \"kubernetes.io/projected/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-kube-api-access-jrpm4\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.402374 4785 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.402383 4785 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f034bb-db60-4d3a-877e-fcbb3aa9eb71-config-volume\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.901481 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" event={"ID":"74f034bb-db60-4d3a-877e-fcbb3aa9eb71","Type":"ContainerDied","Data":"23f2e18e15ab7171c84d6317d8dab0187a9152c6c675377edc25f6d9ac366e58"} Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.901591 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23f2e18e15ab7171c84d6317d8dab0187a9152c6c675377edc25f6d9ac366e58" Nov 26 15:30:03 crc kubenswrapper[4785]: I1126 15:30:03.901678 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29402850-6rtsw" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.362031 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.368995 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.508842 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-image-cache-config-data\") pod \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.508909 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.508937 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njlsx\" (UniqueName: \"kubernetes.io/projected/89d71759-41c7-4c92-b800-7305c551e991-kube-api-access-njlsx\") pod \"89d71759-41c7-4c92-b800-7305c551e991\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.508961 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l8vw\" (UniqueName: \"kubernetes.io/projected/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-kube-api-access-2l8vw\") pod \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\" (UID: \"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7\") " Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.508977 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"89d71759-41c7-4c92-b800-7305c551e991\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.509086 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/89d71759-41c7-4c92-b800-7305c551e991-image-cache-config-data\") pod \"89d71759-41c7-4c92-b800-7305c551e991\" (UID: \"89d71759-41c7-4c92-b800-7305c551e991\") " Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.512754 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "89d71759-41c7-4c92-b800-7305c551e991" (UID: "89d71759-41c7-4c92-b800-7305c551e991"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.513470 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d71759-41c7-4c92-b800-7305c551e991-kube-api-access-njlsx" (OuterVolumeSpecName: "kube-api-access-njlsx") pod "89d71759-41c7-4c92-b800-7305c551e991" (UID: "89d71759-41c7-4c92-b800-7305c551e991"). InnerVolumeSpecName "kube-api-access-njlsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.514887 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" (UID: "dc512af0-b1e9-4ee7-9e2c-ac53acd314f7"). InnerVolumeSpecName "image-cache-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.517587 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-kube-api-access-2l8vw" (OuterVolumeSpecName: "kube-api-access-2l8vw") pod "dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" (UID: "dc512af0-b1e9-4ee7-9e2c-ac53acd314f7"). InnerVolumeSpecName "kube-api-access-2l8vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.517614 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" (UID: "dc512af0-b1e9-4ee7-9e2c-ac53acd314f7"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.517642 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d71759-41c7-4c92-b800-7305c551e991-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "89d71759-41c7-4c92-b800-7305c551e991" (UID: "89d71759-41c7-4c92-b800-7305c551e991"). InnerVolumeSpecName "image-cache-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.611611 4785 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.611678 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njlsx\" (UniqueName: \"kubernetes.io/projected/89d71759-41c7-4c92-b800-7305c551e991-kube-api-access-njlsx\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.611705 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l8vw\" (UniqueName: \"kubernetes.io/projected/dc512af0-b1e9-4ee7-9e2c-ac53acd314f7-kube-api-access-2l8vw\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.611731 4785 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/89d71759-41c7-4c92-b800-7305c551e991-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.916347 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" event={"ID":"89d71759-41c7-4c92-b800-7305c551e991","Type":"ContainerDied","Data":"5cac679cf24988dc05a118e81412c845b977f67f7b5bf8500eb5d0e01e4a3213"} Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.916386 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cac679cf24988dc05a118e81412c845b977f67f7b5bf8500eb5d0e01e4a3213" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.916439 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.928756 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" event={"ID":"dc512af0-b1e9-4ee7-9e2c-ac53acd314f7","Type":"ContainerDied","Data":"1b703f29f9c7b7b7e529542c9e24d20291dba867e73e53a80b1b766264fa91a5"} Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.928797 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b703f29f9c7b7b7e529542c9e24d20291dba867e73e53a80b1b766264fa91a5" Nov 26 15:30:04 crc kubenswrapper[4785]: I1126 15:30:04.928846 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.488701 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 15:30:21 crc kubenswrapper[4785]: E1126 15:30:21.489622 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.489639 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 26 15:30:21 crc kubenswrapper[4785]: E1126 15:30:21.489661 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d71759-41c7-4c92-b800-7305c551e991" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.489670 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d71759-41c7-4c92-b800-7305c551e991" 
containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 26 15:30:21 crc kubenswrapper[4785]: E1126 15:30:21.489691 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f034bb-db60-4d3a-877e-fcbb3aa9eb71" containerName="collect-profiles" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.489699 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f034bb-db60-4d3a-877e-fcbb3aa9eb71" containerName="collect-profiles" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.489850 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f034bb-db60-4d3a-877e-fcbb3aa9eb71" containerName="collect-profiles" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.489946 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d71759-41c7-4c92-b800-7305c551e991" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.489968 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc512af0-b1e9-4ee7-9e2c-ac53acd314f7" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.490531 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.492961 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.493090 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.501140 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.593020 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19688a50-6579-4d01-8482-9cc5e0c5a576-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.593097 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19688a50-6579-4d01-8482-9cc5e0c5a576-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.694771 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19688a50-6579-4d01-8482-9cc5e0c5a576-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.694854 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/19688a50-6579-4d01-8482-9cc5e0c5a576-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.694918 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19688a50-6579-4d01-8482-9cc5e0c5a576-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.729698 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19688a50-6579-4d01-8482-9cc5e0c5a576-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:21 crc kubenswrapper[4785]: I1126 15:30:21.825408 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:22 crc kubenswrapper[4785]: I1126 15:30:22.243387 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 26 15:30:22 crc kubenswrapper[4785]: W1126 15:30:22.250961 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod19688a50_6579_4d01_8482_9cc5e0c5a576.slice/crio-c620da3d92c61292572f11853397a06774987471555c5a7c1418897f3bc7098b WatchSource:0}: Error finding container c620da3d92c61292572f11853397a06774987471555c5a7c1418897f3bc7098b: Status 404 returned error can't find the container with id c620da3d92c61292572f11853397a06774987471555c5a7c1418897f3bc7098b Nov 26 15:30:23 crc kubenswrapper[4785]: I1126 15:30:23.099874 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"19688a50-6579-4d01-8482-9cc5e0c5a576","Type":"ContainerStarted","Data":"6fd717ecad70eb91a112f5955a31aba1333d2de974a02df80d259f4169119241"} Nov 26 15:30:23 crc kubenswrapper[4785]: I1126 15:30:23.100210 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"19688a50-6579-4d01-8482-9cc5e0c5a576","Type":"ContainerStarted","Data":"c620da3d92c61292572f11853397a06774987471555c5a7c1418897f3bc7098b"} Nov 26 15:30:23 crc kubenswrapper[4785]: I1126 15:30:23.123985 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.12396199 podStartE2EDuration="2.12396199s" podCreationTimestamp="2025-11-26 15:30:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:30:23.119076171 +0000 UTC m=+1386.797441965" watchObservedRunningTime="2025-11-26 15:30:23.12396199 +0000 UTC m=+1386.802327764" Nov 26 15:30:24 crc 
kubenswrapper[4785]: I1126 15:30:24.110383 4785 generic.go:334] "Generic (PLEG): container finished" podID="19688a50-6579-4d01-8482-9cc5e0c5a576" containerID="6fd717ecad70eb91a112f5955a31aba1333d2de974a02df80d259f4169119241" exitCode=0 Nov 26 15:30:24 crc kubenswrapper[4785]: I1126 15:30:24.110447 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"19688a50-6579-4d01-8482-9cc5e0c5a576","Type":"ContainerDied","Data":"6fd717ecad70eb91a112f5955a31aba1333d2de974a02df80d259f4169119241"} Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.438580 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.569980 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19688a50-6579-4d01-8482-9cc5e0c5a576-kube-api-access\") pod \"19688a50-6579-4d01-8482-9cc5e0c5a576\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.570122 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19688a50-6579-4d01-8482-9cc5e0c5a576-kubelet-dir\") pod \"19688a50-6579-4d01-8482-9cc5e0c5a576\" (UID: \"19688a50-6579-4d01-8482-9cc5e0c5a576\") " Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.570232 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19688a50-6579-4d01-8482-9cc5e0c5a576-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "19688a50-6579-4d01-8482-9cc5e0c5a576" (UID: "19688a50-6579-4d01-8482-9cc5e0c5a576"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.570593 4785 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19688a50-6579-4d01-8482-9cc5e0c5a576-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.577846 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19688a50-6579-4d01-8482-9cc5e0c5a576-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "19688a50-6579-4d01-8482-9cc5e0c5a576" (UID: "19688a50-6579-4d01-8482-9cc5e0c5a576"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:30:25 crc kubenswrapper[4785]: I1126 15:30:25.672817 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19688a50-6579-4d01-8482-9cc5e0c5a576-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.131215 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"19688a50-6579-4d01-8482-9cc5e0c5a576","Type":"ContainerDied","Data":"c620da3d92c61292572f11853397a06774987471555c5a7c1418897f3bc7098b"} Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.131290 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c620da3d92c61292572f11853397a06774987471555c5a7c1418897f3bc7098b" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.131377 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.288354 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 15:30:26 crc kubenswrapper[4785]: E1126 15:30:26.288795 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19688a50-6579-4d01-8482-9cc5e0c5a576" containerName="pruner" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.288815 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="19688a50-6579-4d01-8482-9cc5e0c5a576" containerName="pruner" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.289042 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="19688a50-6579-4d01-8482-9cc5e0c5a576" containerName="pruner" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.289892 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.298353 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.299611 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.299832 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.381435 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.381931 4785 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-var-lock\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.382231 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5ec6b55-6a88-4582-b241-fea4151ff61f-kube-api-access\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.484030 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5ec6b55-6a88-4582-b241-fea4151ff61f-kube-api-access\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.484187 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.484223 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-var-lock\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.484282 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.484353 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-var-lock\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.515391 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5ec6b55-6a88-4582-b241-fea4151ff61f-kube-api-access\") pod \"installer-9-crc\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:26 crc kubenswrapper[4785]: I1126 15:30:26.625003 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:30:27 crc kubenswrapper[4785]: I1126 15:30:27.078534 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 26 15:30:27 crc kubenswrapper[4785]: I1126 15:30:27.141950 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5ec6b55-6a88-4582-b241-fea4151ff61f","Type":"ContainerStarted","Data":"fe9f125e10749f3d8cf2650a26ec82de26cc6ddf38c63c705a2d896942b1a644"} Nov 26 15:30:28 crc kubenswrapper[4785]: I1126 15:30:28.155286 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5ec6b55-6a88-4582-b241-fea4151ff61f","Type":"ContainerStarted","Data":"b7c91db795c1444c6e6209fe2e095e555fa2d98825b80ce293a7367546d71ab2"} Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.421047 4785 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423159 4785 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423421 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423631 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529" gracePeriod=15 Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423786 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd" gracePeriod=15 Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423803 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76" gracePeriod=15 Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423905 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d" gracePeriod=15 Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.423927 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b" gracePeriod=15 Nov 26 15:31:05 crc 
kubenswrapper[4785]: I1126 15:31:05.424636 4785 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424841 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424858 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424873 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424880 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424889 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424896 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424908 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424915 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424929 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424934 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424942 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424948 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.424955 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.424960 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.425087 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.425095 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.425106 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.425115 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.425126 4785 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.425354 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.469022 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=39.469000165 podStartE2EDuration="39.469000165s" podCreationTimestamp="2025-11-26 15:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:30:28.184234666 +0000 UTC m=+1391.862600510" watchObservedRunningTime="2025-11-26 15:31:05.469000165 +0000 UTC m=+1429.147365939" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.474562 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558127 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558532 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558639 4785 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558695 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558742 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558771 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.558821 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 
15:31:05.558958 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660539 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660606 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660623 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660652 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660628 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660676 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660727 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660779 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660813 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660841 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660885 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660954 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.660988 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.661019 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.661021 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.661052 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: I1126 15:31:05.763582 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:05 crc kubenswrapper[4785]: E1126 15:31:05.789998 4785 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b983cc364139b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:31:05.789256603 +0000 UTC m=+1429.467622357,LastTimestamp:2025-11-26 15:31:05.789256603 +0000 UTC m=+1429.467622357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:31:06 crc kubenswrapper[4785]: E1126 15:31:06.423001 4785 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b983cc364139b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:31:05.789256603 +0000 UTC m=+1429.467622357,LastTimestamp:2025-11-26 15:31:05.789256603 +0000 UTC m=+1429.467622357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.537779 4785 generic.go:334] "Generic (PLEG): container finished" podID="c5ec6b55-6a88-4582-b241-fea4151ff61f" containerID="b7c91db795c1444c6e6209fe2e095e555fa2d98825b80ce293a7367546d71ab2" exitCode=0 Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.537873 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5ec6b55-6a88-4582-b241-fea4151ff61f","Type":"ContainerDied","Data":"b7c91db795c1444c6e6209fe2e095e555fa2d98825b80ce293a7367546d71ab2"} Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.538691 4785 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.539116 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.539726 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.542667 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.544976 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.546120 4785 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d" exitCode=0 Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.546165 4785 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b" exitCode=0 Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.546176 4785 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd" exitCode=0 Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.546184 4785 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76" exitCode=2 Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.546225 4785 scope.go:117] "RemoveContainer" containerID="6a6d4414478527bb76f539125e4be6ebf2a6470ddb46fd35e07180319fa98ede" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.549701 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047"} Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.549751 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8188ebae1f3026c525cb2741b8ddf64c9ea3d9df963a38a59775831763675a18"} Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.550850 4785 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.551882 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.552290 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.974701 4785 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Nov 26 15:31:06 crc kubenswrapper[4785]: I1126 15:31:06.974841 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.063534 4785 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.064134 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.064672 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.588078 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.957039 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.957823 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.958124 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.963965 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.964725 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.965352 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.965795 4785 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:07 crc kubenswrapper[4785]: I1126 15:31:07.966182 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.055902 4785 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC glance-kuttl-tests/glance-glance-default-internal-api-0: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/glance-kuttl-tests/persistentvolumeclaims/glance-glance-default-internal-api-0\": dial tcp 38.102.83.44:6443: connect: connection refused" 
pod="glance-kuttl-tests/glance-default-internal-api-0" volumeName="glance" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.056514 4785 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC glance-kuttl-tests/glance-cache-glance-default-internal-api-0: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/glance-kuttl-tests/persistentvolumeclaims/glance-cache-glance-default-internal-api-0\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="glance-kuttl-tests/glance-default-internal-api-0" volumeName="glance-cache" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114129 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114586 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5ec6b55-6a88-4582-b241-fea4151ff61f-kube-api-access\") pod \"c5ec6b55-6a88-4582-b241-fea4151ff61f\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114288 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114657 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114692 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-kubelet-dir\") pod \"c5ec6b55-6a88-4582-b241-fea4151ff61f\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114716 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114810 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114833 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c5ec6b55-6a88-4582-b241-fea4151ff61f" (UID: "c5ec6b55-6a88-4582-b241-fea4151ff61f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114843 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-var-lock\") pod \"c5ec6b55-6a88-4582-b241-fea4151ff61f\" (UID: \"c5ec6b55-6a88-4582-b241-fea4151ff61f\") " Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114893 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-var-lock" (OuterVolumeSpecName: "var-lock") pod "c5ec6b55-6a88-4582-b241-fea4151ff61f" (UID: "c5ec6b55-6a88-4582-b241-fea4151ff61f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.114906 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.116165 4785 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-var-lock\") on node \"crc\" DevicePath \"\"" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.116229 4785 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.116257 4785 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.116282 4785 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5ec6b55-6a88-4582-b241-fea4151ff61f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.116308 4785 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.124313 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ec6b55-6a88-4582-b241-fea4151ff61f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5ec6b55-6a88-4582-b241-fea4151ff61f" (UID: "c5ec6b55-6a88-4582-b241-fea4151ff61f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.218376 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5ec6b55-6a88-4582-b241-fea4151ff61f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.599610 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5ec6b55-6a88-4582-b241-fea4151ff61f","Type":"ContainerDied","Data":"fe9f125e10749f3d8cf2650a26ec82de26cc6ddf38c63c705a2d896942b1a644"} Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.599646 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.599676 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe9f125e10749f3d8cf2650a26ec82de26cc6ddf38c63c705a2d896942b1a644" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.606078 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.608262 4785 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529" exitCode=0 Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.608346 4785 scope.go:117] "RemoveContainer" containerID="2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.608838 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.626802 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.627253 4785 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.627607 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.647638 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.648004 4785 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.648323 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.648456 4785 scope.go:117] "RemoveContainer" containerID="bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.683163 4785 scope.go:117] "RemoveContainer" containerID="7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.707233 4785 scope.go:117] "RemoveContainer" containerID="936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.725061 4785 scope.go:117] "RemoveContainer" containerID="26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.743404 4785 scope.go:117] "RemoveContainer" containerID="e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.768826 4785 scope.go:117] "RemoveContainer" containerID="2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.769323 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\": container with ID starting with 2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d not found: ID does not 
exist" containerID="2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.769390 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d"} err="failed to get container status \"2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\": rpc error: code = NotFound desc = could not find container \"2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d\": container with ID starting with 2202f005c9b704cc7a97a7387e4563a14bda9e52001c63918f1067a7e676d34d not found: ID does not exist" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.769416 4785 scope.go:117] "RemoveContainer" containerID="bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.769728 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\": container with ID starting with bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b not found: ID does not exist" containerID="bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.769768 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b"} err="failed to get container status \"bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\": rpc error: code = NotFound desc = could not find container \"bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b\": container with ID starting with bcce80995d95ff6993051a4f1ca260133d2513d216a291e3f2c905ec29f88d1b not found: ID does not exist" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.769798 4785 scope.go:117] 
"RemoveContainer" containerID="7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.770272 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\": container with ID starting with 7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd not found: ID does not exist" containerID="7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.770708 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd"} err="failed to get container status \"7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\": rpc error: code = NotFound desc = could not find container \"7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd\": container with ID starting with 7904d95c1ece08fadcdc54e83f80b295f3904cc950fb84624bc2a6967e4d5bdd not found: ID does not exist" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.770765 4785 scope.go:117] "RemoveContainer" containerID="936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.771079 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\": container with ID starting with 936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76 not found: ID does not exist" containerID="936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.771121 4785 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76"} err="failed to get container status \"936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\": rpc error: code = NotFound desc = could not find container \"936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76\": container with ID starting with 936454b4c6cdedceb140c50e2dba0e677ce2b126075b203bdae8a77bbdefde76 not found: ID does not exist" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.771152 4785 scope.go:117] "RemoveContainer" containerID="26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.771420 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\": container with ID starting with 26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529 not found: ID does not exist" containerID="26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.771464 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529"} err="failed to get container status \"26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\": rpc error: code = NotFound desc = could not find container \"26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529\": container with ID starting with 26237e500f4655636096b017418a9029308b9ad9e5c12041546e51d919766529 not found: ID does not exist" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.771491 4785 scope.go:117] "RemoveContainer" containerID="e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08" Nov 26 15:31:08 crc kubenswrapper[4785]: E1126 15:31:08.771761 4785 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\": container with ID starting with e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08 not found: ID does not exist" containerID="e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08" Nov 26 15:31:08 crc kubenswrapper[4785]: I1126 15:31:08.771795 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08"} err="failed to get container status \"e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\": rpc error: code = NotFound desc = could not find container \"e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08\": container with ID starting with e86ca1ef167bd0c31ad5ae1283d40945c72c4e528f3702b577bbbf9818e84c08 not found: ID does not exist" Nov 26 15:31:09 crc kubenswrapper[4785]: I1126 15:31:09.045087 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.496639 4785 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.497527 4785 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.498093 4785 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.498869 4785 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.499367 4785 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:09 crc kubenswrapper[4785]: I1126 15:31:09.499429 4785 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.499871 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms" Nov 26 15:31:09 crc kubenswrapper[4785]: E1126 15:31:09.701725 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Nov 26 15:31:10 crc kubenswrapper[4785]: E1126 15:31:10.102901 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" Nov 26 15:31:11 crc kubenswrapper[4785]: E1126 
15:31:11.250986 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Nov 26 15:31:12 crc kubenswrapper[4785]: E1126 15:31:12.852045 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="3.2s" Nov 26 15:31:16 crc kubenswrapper[4785]: E1126 15:31:16.054233 4785 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="6.4s" Nov 26 15:31:16 crc kubenswrapper[4785]: E1126 15:31:16.423972 4785 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187b983cc364139b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-26 15:31:05.789256603 +0000 UTC m=+1429.467622357,LastTimestamp:2025-11-26 15:31:05.789256603 +0000 UTC 
m=+1429.467622357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.731393 4785 generic.go:334] "Generic (PLEG): container finished" podID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" containerID="4a9013328a4c852f6798521a620dd057d4cb7ec37f86824d3e564c9aa58dfbd8" exitCode=1 Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.731491 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" event={"ID":"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e","Type":"ContainerDied","Data":"4a9013328a4c852f6798521a620dd057d4cb7ec37f86824d3e564c9aa58dfbd8"} Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.733169 4785 scope.go:117] "RemoveContainer" containerID="4a9013328a4c852f6798521a620dd057d4cb7ec37f86824d3e564c9aa58dfbd8" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.733105 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.733831 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.734596 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.739614 4785 generic.go:334] "Generic (PLEG): container finished" podID="c39759f2-3183-48fa-aaee-14b24c5337d7" containerID="a8852f789a2088c60fe9325c6132d607841f1460ce489ff9a31ce9c9aaf74710" exitCode=1 Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.739661 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" event={"ID":"c39759f2-3183-48fa-aaee-14b24c5337d7","Type":"ContainerDied","Data":"a8852f789a2088c60fe9325c6132d607841f1460ce489ff9a31ce9c9aaf74710"} Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.740056 4785 scope.go:117] "RemoveContainer" containerID="a8852f789a2088c60fe9325c6132d607841f1460ce489ff9a31ce9c9aaf74710" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.740666 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.741257 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.741759 4785 status_manager.go:851] "Failed to get status for pod" 
podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.742225 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.743310 4785 generic.go:334] "Generic (PLEG): container finished" podID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" containerID="24bb1ca1f490c9d7f463fb454dd4b20660d0c21a62cc095d06e0fca8c5f19482" exitCode=1 Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.743381 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" event={"ID":"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a","Type":"ContainerDied","Data":"24bb1ca1f490c9d7f463fb454dd4b20660d0c21a62cc095d06e0fca8c5f19482"} Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.744106 4785 scope.go:117] "RemoveContainer" containerID="24bb1ca1f490c9d7f463fb454dd4b20660d0c21a62cc095d06e0fca8c5f19482" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.744633 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.744995 4785 
status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.745297 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.745744 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.746777 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.748655 4785 generic.go:334] "Generic (PLEG): container finished" podID="161001b1-a5be-49ea-8031-e2c11dd07800" containerID="49e10019708c7c760007229dd4fdf7afe9b43556b42661516134fda3234f359a" exitCode=1 Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.748710 4785 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" event={"ID":"161001b1-a5be-49ea-8031-e2c11dd07800","Type":"ContainerDied","Data":"49e10019708c7c760007229dd4fdf7afe9b43556b42661516134fda3234f359a"} Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.749481 4785 scope.go:117] "RemoveContainer" containerID="49e10019708c7c760007229dd4fdf7afe9b43556b42661516134fda3234f359a" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.750106 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.750740 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.751325 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.751983 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.752452 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:16 crc kubenswrapper[4785]: I1126 15:31:16.753264 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.036293 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.049169 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.049623 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.050101 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.050551 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.050923 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.051265 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.051918 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.052352 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.052867 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.053496 4785 status_manager.go:851] 
"Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.053942 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.054497 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.129900 4785 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.62:8081/readyz\": dial tcp 10.217.0.62:8081: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.145470 4785 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.145521 4785 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189" Nov 
26 15:31:17 crc kubenswrapper[4785]: E1126 15:31:17.146424 4785 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.147463 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.744488 4785 scope.go:117] "RemoveContainer" containerID="46c065037f15eff56f988da4c26fe1e139a76145894ebc7129dcf286c98afb94" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.756995 4785 generic.go:334] "Generic (PLEG): container finished" podID="60b24860-07b4-4841-9c4a-a5e6456a45dc" containerID="5b25616e36bb67a22a9385ef0360a339f8572ff565f1fff90f7f634a3370f4b4" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.757126 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" event={"ID":"60b24860-07b4-4841-9c4a-a5e6456a45dc","Type":"ContainerDied","Data":"5b25616e36bb67a22a9385ef0360a339f8572ff565f1fff90f7f634a3370f4b4"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.758210 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.758230 4785 scope.go:117] "RemoveContainer" containerID="5b25616e36bb67a22a9385ef0360a339f8572ff565f1fff90f7f634a3370f4b4" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.758814 4785 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.758863 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" event={"ID":"bc8e3329-ae9c-48b1-a49c-92eeef6ae114","Type":"ContainerDied","Data":"6f9f9b6a1f351c004314e3206140ceb097bd494a3c7a042b212fcd42e9327814"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.758841 4785 generic.go:334] "Generic (PLEG): container finished" podID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" containerID="6f9f9b6a1f351c004314e3206140ceb097bd494a3c7a042b212fcd42e9327814" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.759173 4785 scope.go:117] "RemoveContainer" containerID="6f9f9b6a1f351c004314e3206140ceb097bd494a3c7a042b212fcd42e9327814" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.759346 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.759674 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: 
connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.759983 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.760214 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.760688 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761195 4785 generic.go:334] "Generic (PLEG): container finished" podID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" containerID="cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761219 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" event={"ID":"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a","Type":"ContainerDied","Data":"cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761241 4785 scope.go:117] "RemoveContainer" 
containerID="24bb1ca1f490c9d7f463fb454dd4b20660d0c21a62cc095d06e0fca8c5f19482" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761203 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761527 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761795 4785 scope.go:117] "RemoveContainer" containerID="cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.761988 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: E1126 15:31:17.762099 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-747fb5cb85-5slw2_openstack-operators(8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a)\"" 
pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.762459 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.762793 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.763488 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.763744 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.764678 4785 status_manager.go:851] "Failed to get status for pod" 
podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.764945 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.765152 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.765475 4785 generic.go:334] "Generic (PLEG): container finished" podID="161001b1-a5be-49ea-8031-e2c11dd07800" containerID="1ae8bc58bcb4a5ddfd83e2802195db3a2eb6c8de6dfb1afb973f7491401beee1" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.765499 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.765610 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" event={"ID":"161001b1-a5be-49ea-8031-e2c11dd07800","Type":"ContainerDied","Data":"1ae8bc58bcb4a5ddfd83e2802195db3a2eb6c8de6dfb1afb973f7491401beee1"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.765712 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.765918 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.766169 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.766431 4785 scope.go:117] "RemoveContainer" containerID="1ae8bc58bcb4a5ddfd83e2802195db3a2eb6c8de6dfb1afb973f7491401beee1" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.766444 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.766811 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: E1126 15:31:17.766875 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-84667dbb5-sslgl_metallb-system(161001b1-a5be-49ea-8031-e2c11dd07800)\"" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.767250 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.767458 4785 generic.go:334] "Generic (PLEG): container finished" podID="62cac43a-a147-46b5-bbd6-4b452a008291" containerID="76f39fa4101934ea2d6699cd939e3a8b8fd3a504908f7a2fc04158b6f504764c" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.767522 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" event={"ID":"62cac43a-a147-46b5-bbd6-4b452a008291","Type":"ContainerDied","Data":"76f39fa4101934ea2d6699cd939e3a8b8fd3a504908f7a2fc04158b6f504764c"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.767471 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.767890 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.767916 4785 scope.go:117] "RemoveContainer" containerID="76f39fa4101934ea2d6699cd939e3a8b8fd3a504908f7a2fc04158b6f504764c" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.768219 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.768440 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.768669 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.768866 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.768980 4785 generic.go:334] "Generic (PLEG): container finished" podID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" containerID="a4338cbefff3dd3d29f22b12ea8c1868649745ea0b7ff794ac101d8c6ac73777" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.769035 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" event={"ID":"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e","Type":"ContainerDied","Data":"a4338cbefff3dd3d29f22b12ea8c1868649745ea0b7ff794ac101d8c6ac73777"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.769056 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.769346 4785 scope.go:117] "RemoveContainer" containerID="a4338cbefff3dd3d29f22b12ea8c1868649745ea0b7ff794ac101d8c6ac73777" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.769431 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: E1126 15:31:17.769499 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-647db694df-qnrxh_openstack-operators(3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e)\"" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.769648 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.769858 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.770051 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.770241 4785 status_manager.go:851] "Failed to get status for pod" podUID="62cac43a-a147-46b5-bbd6-4b452a008291" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-56ccd5f88c-dzft5\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.770421 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.770664 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.770861 
4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.771049 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.771361 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.771746 4785 generic.go:334] "Generic (PLEG): container finished" podID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" containerID="cd10c2f7bdcff41f08931840f5f399b0f572b7eed28573f56f2a292c3fb744c1" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.771784 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerDied","Data":"cd10c2f7bdcff41f08931840f5f399b0f572b7eed28573f56f2a292c3fb744c1"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.771718 4785 status_manager.go:851] "Failed to get status for pod" 
podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.772079 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.772233 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.772369 4785 scope.go:117] "RemoveContainer" containerID="cd10c2f7bdcff41f08931840f5f399b0f572b7eed28573f56f2a292c3fb744c1" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.772480 4785 status_manager.go:851] "Failed to get status for pod" podUID="62cac43a-a147-46b5-bbd6-4b452a008291" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-56ccd5f88c-dzft5\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.772815 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" 
pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773034 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773265 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773434 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773535 4785 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="320e5c65bf01308b478954f54449ed322b8efb5a2d9b2620bfaa87733769c2b9" exitCode=0 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773618 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"320e5c65bf01308b478954f54449ed322b8efb5a2d9b2620bfaa87733769c2b9"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773647 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ead08611e9d698f6a7dfb2a1f24ce3dd86af4f62d45621710b3199ed57ac2e1d"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773727 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773818 4785 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773835 4785 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.773893 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: E1126 15:31:17.774011 4785 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.774043 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.774441 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.774644 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.774949 4785 status_manager.go:851] "Failed to get status for pod" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-f68bdc44b-4p65x\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.775254 4785 
status_manager.go:851] "Failed to get status for pod" podUID="62cac43a-a147-46b5-bbd6-4b452a008291" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-56ccd5f88c-dzft5\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.775493 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.775915 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.776178 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.776656 4785 generic.go:334] "Generic (PLEG): container finished" podID="c39759f2-3183-48fa-aaee-14b24c5337d7" containerID="f153c8fd06b9d532de8b91f47f1ded2a952edeb8132f8bf8f8790ea9ad527540" exitCode=1 Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.776739 4785 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" event={"ID":"c39759f2-3183-48fa-aaee-14b24c5337d7","Type":"ContainerDied","Data":"f153c8fd06b9d532de8b91f47f1ded2a952edeb8132f8bf8f8790ea9ad527540"} Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.777133 4785 scope.go:117] "RemoveContainer" containerID="f153c8fd06b9d532de8b91f47f1ded2a952edeb8132f8bf8f8790ea9ad527540" Nov 26 15:31:17 crc kubenswrapper[4785]: E1126 15:31:17.777372 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=operator pod=rabbitmq-cluster-operator-779fc9694b-6pwlx_openstack-operators(c39759f2-3183-48fa-aaee-14b24c5337d7)\"" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.776471 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.779964 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.780141 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" 
pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.780348 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.780580 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.780801 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.781028 4785 status_manager.go:851] "Failed to get status for pod" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-f68bdc44b-4p65x\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc 
kubenswrapper[4785]: I1126 15:31:17.781231 4785 status_manager.go:851] "Failed to get status for pod" podUID="62cac43a-a147-46b5-bbd6-4b452a008291" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-56ccd5f88c-dzft5\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.781422 4785 status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.781650 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.782007 4785 status_manager.go:851] "Failed to get status for pod" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/horizon-operator-controller-manager-647db694df-qnrxh\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.782330 4785 status_manager.go:851] "Failed to get status for pod" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.782899 4785 status_manager.go:851] "Failed to get status for pod" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/rabbitmq-cluster-operator-779fc9694b-6pwlx\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.783252 4785 status_manager.go:851] "Failed to get status for pod" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/keystone-operator-controller-manager-68b4f95d6c-cpkqd\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.783495 4785 status_manager.go:851] "Failed to get status for pod" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/infra-operator-controller-manager-f68bdc44b-4p65x\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.783816 4785 status_manager.go:851] "Failed to get status for pod" podUID="62cac43a-a147-46b5-bbd6-4b452a008291" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/glance-operator-controller-manager-56ccd5f88c-dzft5\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.784081 4785 
status_manager.go:851] "Failed to get status for pod" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/swift-operator-controller-manager-5d784fc5bb-kn67f\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.784333 4785 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.784677 4785 status_manager.go:851] "Failed to get status for pod" podUID="161001b1-a5be-49ea-8031-e2c11dd07800" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-84667dbb5-sslgl\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.784917 4785 status_manager.go:851] "Failed to get status for pod" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack-operators/pods/mariadb-operator-controller-manager-747fb5cb85-5slw2\": dial tcp 38.102.83.44:6443: connect: connection refused" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.803195 4785 scope.go:117] "RemoveContainer" containerID="74ec31c8a8a5f6a8a079033bb0b6dc933a2ee0a52ccbde9648717cfa0cc00d91" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.834197 4785 scope.go:117] "RemoveContainer" 
containerID="49e10019708c7c760007229dd4fdf7afe9b43556b42661516134fda3234f359a" Nov 26 15:31:17 crc kubenswrapper[4785]: I1126 15:31:17.948516 4785 scope.go:117] "RemoveContainer" containerID="4a9013328a4c852f6798521a620dd057d4cb7ec37f86824d3e564c9aa58dfbd8" Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.017799 4785 scope.go:117] "RemoveContainer" containerID="a8852f789a2088c60fe9325c6132d607841f1460ce489ff9a31ce9c9aaf74710" Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.800703 4785 generic.go:334] "Generic (PLEG): container finished" podID="60b24860-07b4-4841-9c4a-a5e6456a45dc" containerID="a09871758ad26f7ef77da7d8a8b9faa644fe8ef538dd79721ded36c93b27ae7c" exitCode=1 Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.800780 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" event={"ID":"60b24860-07b4-4841-9c4a-a5e6456a45dc","Type":"ContainerDied","Data":"a09871758ad26f7ef77da7d8a8b9faa644fe8ef538dd79721ded36c93b27ae7c"} Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.801115 4785 scope.go:117] "RemoveContainer" containerID="5b25616e36bb67a22a9385ef0360a339f8572ff565f1fff90f7f634a3370f4b4" Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.801727 4785 scope.go:117] "RemoveContainer" containerID="a09871758ad26f7ef77da7d8a8b9faa644fe8ef538dd79721ded36c93b27ae7c" Nov 26 15:31:18 crc kubenswrapper[4785]: E1126 15:31:18.801960 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-5d784fc5bb-kn67f_openstack-operators(60b24860-07b4-4841-9c4a-a5e6456a45dc)\"" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.806295 4785 generic.go:334] "Generic (PLEG): container finished" 
podID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" containerID="e16c47549cb3b081e50bcdd5671c357ed565da23b924429f09721b1a29c4ab11" exitCode=1 Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.806368 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" event={"ID":"bc8e3329-ae9c-48b1-a49c-92eeef6ae114","Type":"ContainerDied","Data":"e16c47549cb3b081e50bcdd5671c357ed565da23b924429f09721b1a29c4ab11"} Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.807022 4785 scope.go:117] "RemoveContainer" containerID="e16c47549cb3b081e50bcdd5671c357ed565da23b924429f09721b1a29c4ab11" Nov 26 15:31:18 crc kubenswrapper[4785]: E1126 15:31:18.807285 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-68b4f95d6c-cpkqd_openstack-operators(bc8e3329-ae9c-48b1-a49c-92eeef6ae114)\"" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.841175 4785 generic.go:334] "Generic (PLEG): container finished" podID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" containerID="9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730" exitCode=1 Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.841258 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerDied","Data":"9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730"} Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.841917 4785 scope.go:117] "RemoveContainer" containerID="9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730" Nov 26 15:31:18 crc kubenswrapper[4785]: E1126 15:31:18.842366 4785 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-f68bdc44b-4p65x_openstack-operators(b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3)\"" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3"
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.850708 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.850783 4785 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5" exitCode=1
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.850866 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5"}
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.851758 4785 scope.go:117] "RemoveContainer" containerID="1b79ab88118b1e683ad26fc3bd390961d15af99cce73126c56587c0df819f3a5"
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.856269 4785 generic.go:334] "Generic (PLEG): container finished" podID="62cac43a-a147-46b5-bbd6-4b452a008291" containerID="f56706b4672836f8f2971f85f1f0abaf0f06e449de9fb09dea15da5117600707" exitCode=1
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.856366 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" event={"ID":"62cac43a-a147-46b5-bbd6-4b452a008291","Type":"ContainerDied","Data":"f56706b4672836f8f2971f85f1f0abaf0f06e449de9fb09dea15da5117600707"}
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.857044 4785 scope.go:117] "RemoveContainer" containerID="f56706b4672836f8f2971f85f1f0abaf0f06e449de9fb09dea15da5117600707"
Nov 26 15:31:18 crc kubenswrapper[4785]: E1126 15:31:18.857309 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-56ccd5f88c-dzft5_openstack-operators(62cac43a-a147-46b5-bbd6-4b452a008291)\"" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" podUID="62cac43a-a147-46b5-bbd6-4b452a008291"
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.860083 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b7397084814389214913c2a57338e8bda28b1dcc61d6f02ea60f6832a0ae5429"}
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.860112 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"62a47ccee5de70dbe5061f253ca14fd0b4bb4a50bb55a64909677d5d7cb627a4"}
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.912894 4785 scope.go:117] "RemoveContainer" containerID="6f9f9b6a1f351c004314e3206140ceb097bd494a3c7a042b212fcd42e9327814"
Nov 26 15:31:18 crc kubenswrapper[4785]: I1126 15:31:18.997055 4785 scope.go:117] "RemoveContainer" containerID="cd10c2f7bdcff41f08931840f5f399b0f572b7eed28573f56f2a292c3fb744c1"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.033703 4785 scope.go:117] "RemoveContainer" containerID="76f39fa4101934ea2d6699cd939e3a8b8fd3a504908f7a2fc04158b6f504764c"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.132639 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.211507 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.212146 4785 scope.go:117] "RemoveContainer" containerID="1ae8bc58bcb4a5ddfd83e2802195db3a2eb6c8de6dfb1afb973f7491401beee1"
Nov 26 15:31:19 crc kubenswrapper[4785]: E1126 15:31:19.212324 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-84667dbb5-sslgl_metallb-system(161001b1-a5be-49ea-8031-e2c11dd07800)\"" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" podUID="161001b1-a5be-49ea-8031-e2c11dd07800"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.875412 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c0c991f00ee7d12cd3a76a62a8595cb73267af6783ad9bf7134c5a31eec24d06"}
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.876767 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.876866 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"06a4c532a933a25e1165ba0dfa8827b1e2656368823f72e3b8eb1bf3f3946b9e"}
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.876955 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d97f02d3a8c9883a49618db0e6b396a52413070c81f8fb5b0568fcb222e1056e"}
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.875661 4785 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.877102 4785 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.884289 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Nov 26 15:31:19 crc kubenswrapper[4785]: I1126 15:31:19.884349 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2b19fb3b534fe040ff113bbd25a0e3ae368317725ce11ac936e80c99232a43f9"}
Nov 26 15:31:20 crc kubenswrapper[4785]: I1126 15:31:20.902971 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd"
Nov 26 15:31:20 crc kubenswrapper[4785]: I1126 15:31:20.903648 4785 scope.go:117] "RemoveContainer" containerID="e16c47549cb3b081e50bcdd5671c357ed565da23b924429f09721b1a29c4ab11"
Nov 26 15:31:20 crc kubenswrapper[4785]: E1126 15:31:20.903971 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-68b4f95d6c-cpkqd_openstack-operators(bc8e3329-ae9c-48b1-a49c-92eeef6ae114)\"" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114"
Nov 26 15:31:20 crc kubenswrapper[4785]: I1126 15:31:20.981052 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f"
Nov 26 15:31:20 crc kubenswrapper[4785]: I1126 15:31:20.981096 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f"
Nov 26 15:31:20 crc kubenswrapper[4785]: I1126 15:31:20.981945 4785 scope.go:117] "RemoveContainer" containerID="a09871758ad26f7ef77da7d8a8b9faa644fe8ef538dd79721ded36c93b27ae7c"
Nov 26 15:31:20 crc kubenswrapper[4785]: E1126 15:31:20.982296 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-5d784fc5bb-kn67f_openstack-operators(60b24860-07b4-4841-9c4a-a5e6456a45dc)\"" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc"
Nov 26 15:31:21 crc kubenswrapper[4785]: I1126 15:31:21.145100 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5"
Nov 26 15:31:21 crc kubenswrapper[4785]: I1126 15:31:21.145193 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5"
Nov 26 15:31:21 crc kubenswrapper[4785]: I1126 15:31:21.146120 4785 scope.go:117] "RemoveContainer" containerID="f56706b4672836f8f2971f85f1f0abaf0f06e449de9fb09dea15da5117600707"
Nov 26 15:31:21 crc kubenswrapper[4785]: E1126 15:31:21.146494 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-56ccd5f88c-dzft5_openstack-operators(62cac43a-a147-46b5-bbd6-4b452a008291)\"" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" podUID="62cac43a-a147-46b5-bbd6-4b452a008291"
Nov 26 15:31:21 crc kubenswrapper[4785]: I1126 15:31:21.884106 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh"
Nov 26 15:31:21 crc kubenswrapper[4785]: I1126 15:31:21.884965 4785 scope.go:117] "RemoveContainer" containerID="a4338cbefff3dd3d29f22b12ea8c1868649745ea0b7ff794ac101d8c6ac73777"
Nov 26 15:31:21 crc kubenswrapper[4785]: E1126 15:31:21.885206 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-647db694df-qnrxh_openstack-operators(3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e)\"" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e"
Nov 26 15:31:22 crc kubenswrapper[4785]: I1126 15:31:22.148161 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 15:31:22 crc kubenswrapper[4785]: I1126 15:31:22.148236 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 15:31:22 crc kubenswrapper[4785]: I1126 15:31:22.153004 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.384021 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.385154 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.385704 4785 scope.go:117] "RemoveContainer" containerID="cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62"
Nov 26 15:31:24 crc kubenswrapper[4785]: E1126 15:31:24.385921 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-747fb5cb85-5slw2_openstack-operators(8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a)\"" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.894508 4785 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.927377 4785 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.927415 4785 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.927391 4785 scope.go:117] "RemoveContainer" containerID="cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62"
Nov 26 15:31:24 crc kubenswrapper[4785]: E1126 15:31:24.928092 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-747fb5cb85-5slw2_openstack-operators(8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a)\"" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a"
Nov 26 15:31:24 crc kubenswrapper[4785]: I1126 15:31:24.933827 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 26 15:31:25 crc kubenswrapper[4785]: I1126 15:31:25.937332 4785 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189"
Nov 26 15:31:25 crc kubenswrapper[4785]: I1126 15:31:25.937373 4785 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ce2473df-8540-437a-9b68-a0c79c8f1189"
Nov 26 15:31:26 crc kubenswrapper[4785]: I1126 15:31:26.289600 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 15:31:26 crc kubenswrapper[4785]: I1126 15:31:26.297481 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 15:31:26 crc kubenswrapper[4785]: I1126 15:31:26.961494 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 15:31:27 crc kubenswrapper[4785]: I1126 15:31:27.068771 4785 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="17ddc6bf-0450-4a2b-b7da-39b555e914b3"
Nov 26 15:31:27 crc kubenswrapper[4785]: I1126 15:31:27.130322 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:31:27 crc kubenswrapper[4785]: I1126 15:31:27.131059 4785 scope.go:117] "RemoveContainer" containerID="9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730"
Nov 26 15:31:27 crc kubenswrapper[4785]: E1126 15:31:27.131341 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-f68bdc44b-4p65x_openstack-operators(b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3)\"" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3"
Nov 26 15:31:27 crc kubenswrapper[4785]: I1126 15:31:27.131399 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:31:27 crc kubenswrapper[4785]: I1126 15:31:27.973476 4785 scope.go:117] "RemoveContainer" containerID="9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730"
Nov 26 15:31:27 crc kubenswrapper[4785]: E1126 15:31:27.973897 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-f68bdc44b-4p65x_openstack-operators(b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3)\"" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3"
Nov 26 15:31:29 crc kubenswrapper[4785]: I1126 15:31:29.141865 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 26 15:31:30 crc kubenswrapper[4785]: I1126 15:31:30.036663 4785 scope.go:117] "RemoveContainer" containerID="f153c8fd06b9d532de8b91f47f1ded2a952edeb8132f8bf8f8790ea9ad527540"
Nov 26 15:31:30 crc kubenswrapper[4785]: I1126 15:31:30.902610 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd"
Nov 26 15:31:30 crc kubenswrapper[4785]: I1126 15:31:30.904344 4785 scope.go:117] "RemoveContainer" containerID="e16c47549cb3b081e50bcdd5671c357ed565da23b924429f09721b1a29c4ab11"
Nov 26 15:31:30 crc kubenswrapper[4785]: I1126 15:31:30.999684 4785 generic.go:334] "Generic (PLEG): container finished" podID="c39759f2-3183-48fa-aaee-14b24c5337d7" containerID="0656cbbef2ca4d93172de86af1871129e2f9cdfe898e4a3ff4e71664e7cc41ee" exitCode=1
Nov 26 15:31:30 crc kubenswrapper[4785]: I1126 15:31:30.999736 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" event={"ID":"c39759f2-3183-48fa-aaee-14b24c5337d7","Type":"ContainerDied","Data":"0656cbbef2ca4d93172de86af1871129e2f9cdfe898e4a3ff4e71664e7cc41ee"}
Nov 26 15:31:30 crc kubenswrapper[4785]: I1126 15:31:30.999771 4785 scope.go:117] "RemoveContainer" containerID="f153c8fd06b9d532de8b91f47f1ded2a952edeb8132f8bf8f8790ea9ad527540"
Nov 26 15:31:31 crc kubenswrapper[4785]: I1126 15:31:31.000410 4785 scope.go:117] "RemoveContainer" containerID="0656cbbef2ca4d93172de86af1871129e2f9cdfe898e4a3ff4e71664e7cc41ee"
Nov 26 15:31:31 crc kubenswrapper[4785]: E1126 15:31:31.000669 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=operator pod=rabbitmq-cluster-operator-779fc9694b-6pwlx_openstack-operators(c39759f2-3183-48fa-aaee-14b24c5337d7)\"" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7"
Nov 26 15:31:31 crc kubenswrapper[4785]: I1126 15:31:31.037076 4785 scope.go:117] "RemoveContainer" containerID="1ae8bc58bcb4a5ddfd83e2802195db3a2eb6c8de6dfb1afb973f7491401beee1"
Nov 26 15:31:31 crc kubenswrapper[4785]: I1126 15:31:31.884619 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh"
Nov 26 15:31:31 crc kubenswrapper[4785]: I1126 15:31:31.885474 4785 scope.go:117] "RemoveContainer" containerID="a4338cbefff3dd3d29f22b12ea8c1868649745ea0b7ff794ac101d8c6ac73777"
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.014076 4785 generic.go:334] "Generic (PLEG): container finished" podID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" containerID="b49d0a231b02050ae980bc7bc9d683ace1a8cf024b23508af7ae9e4865580cdb" exitCode=1
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.014171 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" event={"ID":"bc8e3329-ae9c-48b1-a49c-92eeef6ae114","Type":"ContainerDied","Data":"b49d0a231b02050ae980bc7bc9d683ace1a8cf024b23508af7ae9e4865580cdb"}
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.014216 4785 scope.go:117] "RemoveContainer" containerID="e16c47549cb3b081e50bcdd5671c357ed565da23b924429f09721b1a29c4ab11"
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.014923 4785 scope.go:117] "RemoveContainer" containerID="b49d0a231b02050ae980bc7bc9d683ace1a8cf024b23508af7ae9e4865580cdb"
Nov 26 15:31:32 crc kubenswrapper[4785]: E1126 15:31:32.015253 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-68b4f95d6c-cpkqd_openstack-operators(bc8e3329-ae9c-48b1-a49c-92eeef6ae114)\"" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114"
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.020086 4785 generic.go:334] "Generic (PLEG): container finished" podID="161001b1-a5be-49ea-8031-e2c11dd07800" containerID="d4af367c56f3c38355ea85841d9d9666b98fd70e7493e25b6c38880c8398b31f" exitCode=1
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.020168 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" event={"ID":"161001b1-a5be-49ea-8031-e2c11dd07800","Type":"ContainerDied","Data":"d4af367c56f3c38355ea85841d9d9666b98fd70e7493e25b6c38880c8398b31f"}
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.020839 4785 scope.go:117] "RemoveContainer" containerID="d4af367c56f3c38355ea85841d9d9666b98fd70e7493e25b6c38880c8398b31f"
Nov 26 15:31:32 crc kubenswrapper[4785]: E1126 15:31:32.021154 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-84667dbb5-sslgl_metallb-system(161001b1-a5be-49ea-8031-e2c11dd07800)\"" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" podUID="161001b1-a5be-49ea-8031-e2c11dd07800"
Nov 26 15:31:32 crc kubenswrapper[4785]: I1126 15:31:32.071309 4785 scope.go:117] "RemoveContainer" containerID="1ae8bc58bcb4a5ddfd83e2802195db3a2eb6c8de6dfb1afb973f7491401beee1"
Nov 26 15:31:33 crc kubenswrapper[4785]: I1126 15:31:33.046352 4785 generic.go:334] "Generic (PLEG): container finished" podID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" containerID="60d827a8d87a92fe2f68095a9b54f28b944efeca0837f67ff2d4bc193d4cb7d2" exitCode=1
Nov 26 15:31:33 crc kubenswrapper[4785]: I1126 15:31:33.051073 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" event={"ID":"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e","Type":"ContainerDied","Data":"60d827a8d87a92fe2f68095a9b54f28b944efeca0837f67ff2d4bc193d4cb7d2"}
Nov 26 15:31:33 crc kubenswrapper[4785]: I1126 15:31:33.051363 4785 scope.go:117] "RemoveContainer" containerID="a4338cbefff3dd3d29f22b12ea8c1868649745ea0b7ff794ac101d8c6ac73777"
Nov 26 15:31:33 crc kubenswrapper[4785]: I1126 15:31:33.052368 4785 scope.go:117] "RemoveContainer" containerID="60d827a8d87a92fe2f68095a9b54f28b944efeca0837f67ff2d4bc193d4cb7d2"
Nov 26 15:31:33 crc kubenswrapper[4785]: E1126 15:31:33.052907 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-647db694df-qnrxh_openstack-operators(3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e)\"" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e"
Nov 26 15:31:35 crc kubenswrapper[4785]: I1126 15:31:35.035897 4785 scope.go:117] "RemoveContainer" containerID="a09871758ad26f7ef77da7d8a8b9faa644fe8ef538dd79721ded36c93b27ae7c"
Nov 26 15:31:35 crc kubenswrapper[4785]: I1126 15:31:35.522496 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Nov 26 15:31:35 crc kubenswrapper[4785]: I1126 15:31:35.785119 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Nov 26 15:31:35 crc kubenswrapper[4785]: I1126 15:31:35.882857 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.036756 4785 scope.go:117] "RemoveContainer" containerID="f56706b4672836f8f2971f85f1f0abaf0f06e449de9fb09dea15da5117600707"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.092443 4785 generic.go:334] "Generic (PLEG): container finished" podID="60b24860-07b4-4841-9c4a-a5e6456a45dc" containerID="b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a" exitCode=1
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.092507 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" event={"ID":"60b24860-07b4-4841-9c4a-a5e6456a45dc","Type":"ContainerDied","Data":"b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a"}
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.092598 4785 scope.go:117] "RemoveContainer" containerID="a09871758ad26f7ef77da7d8a8b9faa644fe8ef538dd79721ded36c93b27ae7c"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.093417 4785 scope.go:117] "RemoveContainer" containerID="b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a"
Nov 26 15:31:36 crc kubenswrapper[4785]: E1126 15:31:36.093852 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-5d784fc5bb-kn67f_openstack-operators(60b24860-07b4-4841-9c4a-a5e6456a45dc)\"" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.162718 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.270904 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.381923 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.389682 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.521315 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.856992 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Nov 26 15:31:36 crc kubenswrapper[4785]: I1126 15:31:36.991630 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.103266 4785 generic.go:334] "Generic (PLEG): container finished" podID="62cac43a-a147-46b5-bbd6-4b452a008291" containerID="bfdbbfd53693ffb13e3109168263dbcd1dd7251f73c2975eec5193a202b71db5" exitCode=1
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.103334 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" event={"ID":"62cac43a-a147-46b5-bbd6-4b452a008291","Type":"ContainerDied","Data":"bfdbbfd53693ffb13e3109168263dbcd1dd7251f73c2975eec5193a202b71db5"}
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.103376 4785 scope.go:117] "RemoveContainer" containerID="f56706b4672836f8f2971f85f1f0abaf0f06e449de9fb09dea15da5117600707"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.103925 4785 scope.go:117] "RemoveContainer" containerID="bfdbbfd53693ffb13e3109168263dbcd1dd7251f73c2975eec5193a202b71db5"
Nov 26 15:31:37 crc kubenswrapper[4785]: E1126 15:31:37.104174 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-56ccd5f88c-dzft5_openstack-operators(62cac43a-a147-46b5-bbd6-4b452a008291)\"" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" podUID="62cac43a-a147-46b5-bbd6-4b452a008291"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.178179 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.289655 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.289736 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.353089 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.424458 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.550469 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.759914 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Nov 26 15:31:37 crc kubenswrapper[4785]: I1126 15:31:37.976074 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.012250 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.037038 4785 scope.go:117] "RemoveContainer" containerID="9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.358492 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.391964 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.407622 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.440182 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.494055 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.498205 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.552992 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.648573 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.755968 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.800828 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.806503 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.839875 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.848696 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.872072 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert"
Nov 26 15:31:38 crc kubenswrapper[4785]: I1126 15:31:38.964474 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.011100 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.016972 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.098406 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.126964 4785 generic.go:334] "Generic (PLEG): container finished" podID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3" containerID="94ca74f05ace369e770af1e6ec9079e0bfd52006a01284ba7aa5d333cc167f5d" exitCode=1
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.126950 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerDied","Data":"94ca74f05ace369e770af1e6ec9079e0bfd52006a01284ba7aa5d333cc167f5d"}
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.127050 4785 scope.go:117] "RemoveContainer" containerID="9ab49f54300742800821728bac50f7c6edebd7d7a1451d0d701d2787f2a52730"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.128330 4785 scope.go:117] "RemoveContainer" containerID="94ca74f05ace369e770af1e6ec9079e0bfd52006a01284ba7aa5d333cc167f5d"
Nov 26 15:31:39 crc kubenswrapper[4785]: E1126 15:31:39.129099 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-f68bdc44b-4p65x_openstack-operators(b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3)\"" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.151258 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.176122 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.211202 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.212892 4785 scope.go:117] "RemoveContainer" containerID="d4af367c56f3c38355ea85841d9d9666b98fd70e7493e25b6c38880c8398b31f"
Nov 26 15:31:39 crc kubenswrapper[4785]: E1126 15:31:39.213123 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-84667dbb5-sslgl_metallb-system(161001b1-a5be-49ea-8031-e2c11dd07800)\"" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" podUID="161001b1-a5be-49ea-8031-e2c11dd07800"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.215214 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.305636 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.557823 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.598146 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.674245 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-6wjlm"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.777811 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.779792 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.797079 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.891848 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.895261 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.932880 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.953480 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.975202 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 26 15:31:39 crc kubenswrapper[4785]: I1126 15:31:39.995432 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.035916 4785 scope.go:117] "RemoveContainer" containerID="cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62"
Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.101140 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.153486 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.154458 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.156309 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.198125 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Nov 26 15:31:40 crc
kubenswrapper[4785]: I1126 15:31:40.263315 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.359082 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.359158 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.478497 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.485077 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.535688 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.544248 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.544658 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.548630 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.620878 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.630983 4785 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 26 15:31:40 crc 
kubenswrapper[4785]: I1126 15:31:40.644963 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.644941937 podStartE2EDuration="35.644941937s" podCreationTimestamp="2025-11-26 15:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:31:24.772061967 +0000 UTC m=+1448.450427731" watchObservedRunningTime="2025-11-26 15:31:40.644941937 +0000 UTC m=+1464.323307721" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.645479 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.645536 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.652382 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.655438 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.677830 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.677808052 podStartE2EDuration="16.677808052s" podCreationTimestamp="2025-11-26 15:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:31:40.670475932 +0000 UTC m=+1464.348841726" watchObservedRunningTime="2025-11-26 15:31:40.677808052 +0000 UTC m=+1464.356173826" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.705230 4785 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"metallb-system"/"openshift-service-ca.crt" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.745378 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.760597 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.899374 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.902771 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.903784 4785 scope.go:117] "RemoveContainer" containerID="b49d0a231b02050ae980bc7bc9d683ace1a8cf024b23508af7ae9e4865580cdb" Nov 26 15:31:40 crc kubenswrapper[4785]: E1126 15:31:40.904283 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-68b4f95d6c-cpkqd_openstack-operators(bc8e3329-ae9c-48b1-a49c-92eeef6ae114)\"" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.981233 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.981316 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:31:40 crc kubenswrapper[4785]: I1126 15:31:40.981912 4785 scope.go:117] 
"RemoveContainer" containerID="b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a" Nov 26 15:31:40 crc kubenswrapper[4785]: E1126 15:31:40.982100 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-5d784fc5bb-kn67f_openstack-operators(60b24860-07b4-4841-9c4a-a5e6456a45dc)\"" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.012957 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.022064 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.145248 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.145322 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.146131 4785 scope.go:117] "RemoveContainer" containerID="bfdbbfd53693ffb13e3109168263dbcd1dd7251f73c2975eec5193a202b71db5" Nov 26 15:31:41 crc kubenswrapper[4785]: E1126 15:31:41.146504 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-56ccd5f88c-dzft5_openstack-operators(62cac43a-a147-46b5-bbd6-4b452a008291)\"" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" 
podUID="62cac43a-a147-46b5-bbd6-4b452a008291" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.156662 4785 generic.go:334] "Generic (PLEG): container finished" podID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" containerID="1c4d56368d21e55ca5e605cd5efcd1425e412efc9e9d1d8d4dd4cf2b910a1b69" exitCode=1 Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.156770 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" event={"ID":"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a","Type":"ContainerDied","Data":"1c4d56368d21e55ca5e605cd5efcd1425e412efc9e9d1d8d4dd4cf2b910a1b69"} Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.156852 4785 scope.go:117] "RemoveContainer" containerID="cd44bf4be8266213edfeb75984a0f32780c31cbacd9894cbd20a9535fb5dfd62" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.157391 4785 scope.go:117] "RemoveContainer" containerID="b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a" Nov 26 15:31:41 crc kubenswrapper[4785]: E1126 15:31:41.157851 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-5d784fc5bb-kn67f_openstack-operators(60b24860-07b4-4841-9c4a-a5e6456a45dc)\"" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.158403 4785 scope.go:117] "RemoveContainer" containerID="1c4d56368d21e55ca5e605cd5efcd1425e412efc9e9d1d8d4dd4cf2b910a1b69" Nov 26 15:31:41 crc kubenswrapper[4785]: E1126 15:31:41.158964 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=mariadb-operator-controller-manager-747fb5cb85-5slw2_openstack-operators(8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a)\"" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.201770 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.301482 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mjzws" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.309382 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.312720 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.366188 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.418572 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-pnk82" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.442416 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.454339 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.463436 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.559130 4785 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.674840 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.820002 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.852801 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.867615 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.883600 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.883764 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.884236 4785 scope.go:117] "RemoveContainer" containerID="60d827a8d87a92fe2f68095a9b54f28b944efeca0837f67ff2d4bc193d4cb7d2" Nov 26 15:31:41 crc kubenswrapper[4785]: E1126 15:31:41.884429 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-647db694df-qnrxh_openstack-operators(3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e)\"" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.907285 4785 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.995027 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 26 15:31:41 crc kubenswrapper[4785]: I1126 15:31:41.997025 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.008443 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.041587 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.105035 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.128660 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.144215 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.155466 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.252063 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.258498 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 26 15:31:42 crc kubenswrapper[4785]: 
I1126 15:31:42.271193 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.311939 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.444977 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.448121 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.564376 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.601497 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.630709 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.641107 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fktn4" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.650354 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.654886 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.660500 4785 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.757876 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.775014 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.789636 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-m98qt" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.796964 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.825425 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 26 15:31:42 crc kubenswrapper[4785]: I1126 15:31:42.917604 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.059838 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.062223 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.090423 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.097488 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.182164 
4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.229392 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.256599 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.299639 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-p94cf" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.315544 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.395698 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.435363 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.448880 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.504049 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.602589 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.617621 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.627955 4785 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.633386 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.645473 4785 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.726884 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mlgnr" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.742701 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.758633 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.763061 4785 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.793711 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.826104 4785 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.872905 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.935138 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 26 15:31:43 crc kubenswrapper[4785]: 
I1126 15:31:43.939372 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.950824 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 26 15:31:43 crc kubenswrapper[4785]: I1126 15:31:43.968255 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.014489 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.019535 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.081675 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.109978 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.114541 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.132643 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.183050 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.256375 4785 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.307919 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.337644 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.384727 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.384803 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.385298 4785 scope.go:117] "RemoveContainer" containerID="1c4d56368d21e55ca5e605cd5efcd1425e412efc9e9d1d8d4dd4cf2b910a1b69"
Nov 26 15:31:44 crc kubenswrapper[4785]: E1126 15:31:44.385681 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=mariadb-operator-controller-manager-747fb5cb85-5slw2_openstack-operators(8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a)\"" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.405953 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.481343 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.524868 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.527206 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.575154 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.581599 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.614297 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.620789 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.677669 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.714508 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.876172 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.910879 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.927379 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Nov 26 15:31:44 crc kubenswrapper[4785]: I1126 15:31:44.998781 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.033181 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.093009 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.127841 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.194797 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.215110 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.219726 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.244007 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.259073 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.287622 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.315156 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.315920 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.471417 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.534203 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.551240 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.552215 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.574407 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.644339 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.661898 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-qq2jl"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.832158 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.882098 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.933182 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.969479 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 26 15:31:45 crc kubenswrapper[4785]: I1126 15:31:45.996745 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.028467 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-gwfvc"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.037010 4785 scope.go:117] "RemoveContainer" containerID="0656cbbef2ca4d93172de86af1871129e2f9cdfe898e4a3ff4e71664e7cc41ee"
Nov 26 15:31:46 crc kubenswrapper[4785]: E1126 15:31:46.037334 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=operator pod=rabbitmq-cluster-operator-779fc9694b-6pwlx_openstack-operators(c39759f2-3183-48fa-aaee-14b24c5337d7)\"" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" podUID="c39759f2-3183-48fa-aaee-14b24c5337d7"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.047706 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.077665 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.082105 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4nwqx"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.261042 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.262583 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.285938 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-rzv55"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.287225 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.296496 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-k26qg"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.340876 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.383103 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.385461 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-qnhwd"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.386599 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.421123 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.486911 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.489741 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.534376 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.538480 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.554537 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.558175 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.587025 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.716393 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.721870 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.726162 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.751086 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.765099 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.767784 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.811807 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.834183 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 26 15:31:46 crc kubenswrapper[4785]: I1126 15:31:46.903965 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.045577 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.106879 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.123520 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.130009 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.130061 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.131033 4785 scope.go:117] "RemoveContainer" containerID="94ca74f05ace369e770af1e6ec9079e0bfd52006a01284ba7aa5d333cc167f5d"
Nov 26 15:31:47 crc kubenswrapper[4785]: E1126 15:31:47.131441 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=infra-operator-controller-manager-f68bdc44b-4p65x_openstack-operators(b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3)\"" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" podUID="b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.240436 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.254523 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-xjh5b"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.314595 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-4mfcx"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.315860 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.325739 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.341435 4785 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.341714 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047" gracePeriod=5
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.345344 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.380917 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.414793 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.422887 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.442455 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.513284 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.515131 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.556421 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.750930 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.769948 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.785008 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.805565 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.832646 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.833993 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.945962 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Nov 26 15:31:47 crc kubenswrapper[4785]: I1126 15:31:47.989702 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.104961 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.105482 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.127181 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.315964 4785 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.321602 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.479793 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.661216 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.676636 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xj7wq"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.709271 4785 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.745711 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.821459 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.903536 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.915790 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 26 15:31:48 crc kubenswrapper[4785]: I1126 15:31:48.977512 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-48tqc"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.001749 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-v6tlh"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.013838 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.053776 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.080073 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.176903 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.192198 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.203104 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.245577 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.266844 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.361575 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.364107 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.519686 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.544098 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kz6tr"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.603215 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.609456 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.647076 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.652621 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.800627 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.852416 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.941862 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-66h9w"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.960166 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.960678 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Nov 26 15:31:49 crc kubenswrapper[4785]: I1126 15:31:49.977776 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.059114 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.113473 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.166477 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.425875 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.594757 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.630179 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.690366 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.706171 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.848431 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-rmbn8"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.902361 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.903123 4785 scope.go:117] "RemoveContainer" containerID="b49d0a231b02050ae980bc7bc9d683ace1a8cf024b23508af7ae9e4865580cdb"
Nov 26 15:31:50 crc kubenswrapper[4785]: E1126 15:31:50.903417 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-68b4f95d6c-cpkqd_openstack-operators(bc8e3329-ae9c-48b1-a49c-92eeef6ae114)\"" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" podUID="bc8e3329-ae9c-48b1-a49c-92eeef6ae114"
Nov 26 15:31:50 crc kubenswrapper[4785]: I1126 15:31:50.978370 4785 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nhb66"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.047358 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.094467 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.220397 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.225608 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4hcqv"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.254649 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.399452 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.466932 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.849880 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.876056 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.880464 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.881739 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.884097 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.884857 4785 scope.go:117] "RemoveContainer" containerID="60d827a8d87a92fe2f68095a9b54f28b944efeca0837f67ff2d4bc193d4cb7d2"
Nov 26 15:31:51 crc kubenswrapper[4785]: E1126 15:31:51.885108 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=horizon-operator-controller-manager-647db694df-qnrxh_openstack-operators(3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e)\"" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" podUID="3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.919295 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Nov 26 15:31:51 crc kubenswrapper[4785]: I1126 15:31:51.948486 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.171415 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-xx9r4"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.204987 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.333377 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.475689 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.475756 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.614904 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615019 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615045 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615082 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615103 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615106 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615135 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615190 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615236 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615510 4785 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615525 4785 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615537 4785 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.615596 4785 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.623986 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.717348 4785 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.874681 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 26 15:31:52 crc kubenswrapper[4785]: I1126 15:31:52.895081 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.036976 4785 scope.go:117] "RemoveContainer" containerID="d4af367c56f3c38355ea85841d9d9666b98fd70e7493e25b6c38880c8398b31f"
Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.047387 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.047669 4785 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.058907 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.058960 4785 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b0d85bba-841a-4605-b159-cbb37d384c66"
Nov 26 15:31:53 crc
kubenswrapper[4785]: I1126 15:31:53.065176 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.065213 4785 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b0d85bba-841a-4605-b159-cbb37d384c66" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.276185 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.276279 4785 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047" exitCode=137 Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.276366 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.276402 4785 scope.go:117] "RemoveContainer" containerID="b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.279493 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" event={"ID":"161001b1-a5be-49ea-8031-e2c11dd07800","Type":"ContainerStarted","Data":"8fa7bd9aa005831551b78fa8e8ddaa94df733db800caba108e744391c96dbf62"} Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.279970 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.295653 4785 scope.go:117] "RemoveContainer" containerID="b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047" Nov 26 15:31:53 crc kubenswrapper[4785]: E1126 15:31:53.299946 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047\": container with ID starting with b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047 not found: ID does not exist" containerID="b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.300288 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047"} err="failed to get container status \"b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047\": rpc error: code = NotFound desc = could not find container \"b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047\": container with ID starting with 
b3efab6087bbac29595039c25728ab3ec31fc432f718c79b860154e99919b047 not found: ID does not exist" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.769283 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 26 15:31:53 crc kubenswrapper[4785]: I1126 15:31:53.954729 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 26 15:31:54 crc kubenswrapper[4785]: I1126 15:31:54.036244 4785 scope.go:117] "RemoveContainer" containerID="b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a" Nov 26 15:31:54 crc kubenswrapper[4785]: E1126 15:31:54.036642 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=swift-operator-controller-manager-5d784fc5bb-kn67f_openstack-operators(60b24860-07b4-4841-9c4a-a5e6456a45dc)\"" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" podUID="60b24860-07b4-4841-9c4a-a5e6456a45dc" Nov 26 15:31:54 crc kubenswrapper[4785]: I1126 15:31:54.463670 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 26 15:31:56 crc kubenswrapper[4785]: I1126 15:31:56.036265 4785 scope.go:117] "RemoveContainer" containerID="1c4d56368d21e55ca5e605cd5efcd1425e412efc9e9d1d8d4dd4cf2b910a1b69" Nov 26 15:31:56 crc kubenswrapper[4785]: I1126 15:31:56.036493 4785 scope.go:117] "RemoveContainer" containerID="bfdbbfd53693ffb13e3109168263dbcd1dd7251f73c2975eec5193a202b71db5" Nov 26 15:31:56 crc kubenswrapper[4785]: E1126 15:31:56.036578 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=mariadb-operator-controller-manager-747fb5cb85-5slw2_openstack-operators(8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a)\"" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" podUID="8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a" Nov 26 15:31:56 crc kubenswrapper[4785]: E1126 15:31:56.036893 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=glance-operator-controller-manager-56ccd5f88c-dzft5_openstack-operators(62cac43a-a147-46b5-bbd6-4b452a008291)\"" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" podUID="62cac43a-a147-46b5-bbd6-4b452a008291" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.417600 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tvs7f"] Nov 26 15:31:58 crc kubenswrapper[4785]: E1126 15:31:58.418273 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.418288 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 26 15:31:58 crc kubenswrapper[4785]: E1126 15:31:58.418314 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" containerName="installer" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.418325 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" containerName="installer" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.418494 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.418513 4785 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c5ec6b55-6a88-4582-b241-fea4151ff61f" containerName="installer" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.419892 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.430140 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvs7f"] Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.525146 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-utilities\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.525203 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-catalog-content\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.525225 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72rk\" (UniqueName: \"kubernetes.io/projected/850a62cf-3d25-4de4-8783-d692272ee55f-kube-api-access-n72rk\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.626855 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-utilities\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " 
pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.626932 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-catalog-content\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.626980 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72rk\" (UniqueName: \"kubernetes.io/projected/850a62cf-3d25-4de4-8783-d692272ee55f-kube-api-access-n72rk\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.627755 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-utilities\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.627748 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-catalog-content\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.648169 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72rk\" (UniqueName: \"kubernetes.io/projected/850a62cf-3d25-4de4-8783-d692272ee55f-kube-api-access-n72rk\") pod \"redhat-operators-tvs7f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " pod="openshift-marketplace/redhat-operators-tvs7f" Nov 
26 15:31:58 crc kubenswrapper[4785]: I1126 15:31:58.736643 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:31:59 crc kubenswrapper[4785]: I1126 15:31:59.036064 4785 scope.go:117] "RemoveContainer" containerID="0656cbbef2ca4d93172de86af1871129e2f9cdfe898e4a3ff4e71664e7cc41ee" Nov 26 15:31:59 crc kubenswrapper[4785]: I1126 15:31:59.180265 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvs7f"] Nov 26 15:31:59 crc kubenswrapper[4785]: W1126 15:31:59.184853 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod850a62cf_3d25_4de4_8783_d692272ee55f.slice/crio-d4489e5117b7a142e4a8cb0ddb42d2bdaa2e73dd72a7c7d19953c4b1f17f8f2c WatchSource:0}: Error finding container d4489e5117b7a142e4a8cb0ddb42d2bdaa2e73dd72a7c7d19953c4b1f17f8f2c: Status 404 returned error can't find the container with id d4489e5117b7a142e4a8cb0ddb42d2bdaa2e73dd72a7c7d19953c4b1f17f8f2c Nov 26 15:31:59 crc kubenswrapper[4785]: I1126 15:31:59.334040 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerStarted","Data":"5774977d7fb55468889bdb29eb42a4e1e2af8ce6b3424a149ad6a0ad16a8375e"} Nov 26 15:31:59 crc kubenswrapper[4785]: I1126 15:31:59.334091 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerStarted","Data":"d4489e5117b7a142e4a8cb0ddb42d2bdaa2e73dd72a7c7d19953c4b1f17f8f2c"} Nov 26 15:31:59 crc kubenswrapper[4785]: I1126 15:31:59.336311 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6pwlx" 
event={"ID":"c39759f2-3183-48fa-aaee-14b24c5337d7","Type":"ContainerStarted","Data":"f34cce391cde66f4f6fdebe699731b8a4987a451873bd62da6182d22a4c70977"} Nov 26 15:32:00 crc kubenswrapper[4785]: I1126 15:32:00.356308 4785 generic.go:334] "Generic (PLEG): container finished" podID="850a62cf-3d25-4de4-8783-d692272ee55f" containerID="5774977d7fb55468889bdb29eb42a4e1e2af8ce6b3424a149ad6a0ad16a8375e" exitCode=0 Nov 26 15:32:00 crc kubenswrapper[4785]: I1126 15:32:00.356667 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerDied","Data":"5774977d7fb55468889bdb29eb42a4e1e2af8ce6b3424a149ad6a0ad16a8375e"} Nov 26 15:32:00 crc kubenswrapper[4785]: I1126 15:32:00.361858 4785 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.036269 4785 scope.go:117] "RemoveContainer" containerID="94ca74f05ace369e770af1e6ec9079e0bfd52006a01284ba7aa5d333cc167f5d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.078967 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cmq9d"] Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.080866 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.088926 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmq9d"] Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.174542 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-catalog-content\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.174622 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-utilities\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.174782 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x46mq\" (UniqueName: \"kubernetes.io/projected/98cf1e09-3af0-4286-aa6f-43dbff032ba1-kube-api-access-x46mq\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.276485 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x46mq\" (UniqueName: \"kubernetes.io/projected/98cf1e09-3af0-4286-aa6f-43dbff032ba1-kube-api-access-x46mq\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.276667 4785 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-catalog-content\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.276694 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-utilities\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.277145 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-catalog-content\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.277248 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-utilities\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.300619 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x46mq\" (UniqueName: \"kubernetes.io/projected/98cf1e09-3af0-4286-aa6f-43dbff032ba1-kube-api-access-x46mq\") pod \"certified-operators-cmq9d\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.365723 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" event={"ID":"b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3","Type":"ContainerStarted","Data":"13c254bfef8d299f0484915d7342f848b4be2fb74df117d2b4be418c7839156c"} Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.365974 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.367965 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerStarted","Data":"17daf3f3a3eb2c0a151f300e7066c3d675628e5d655e67e2857d35d3cd168bcd"} Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.474652 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:01 crc kubenswrapper[4785]: I1126 15:32:01.942864 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmq9d"] Nov 26 15:32:01 crc kubenswrapper[4785]: W1126 15:32:01.949690 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cf1e09_3af0_4286_aa6f_43dbff032ba1.slice/crio-7bde13b94a28351dd6efa79b3720dd393248618d1b950614b3b12f2a0a055e06 WatchSource:0}: Error finding container 7bde13b94a28351dd6efa79b3720dd393248618d1b950614b3b12f2a0a055e06: Status 404 returned error can't find the container with id 7bde13b94a28351dd6efa79b3720dd393248618d1b950614b3b12f2a0a055e06 Nov 26 15:32:02 crc kubenswrapper[4785]: I1126 15:32:02.379990 4785 generic.go:334] "Generic (PLEG): container finished" podID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerID="0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3" exitCode=0 Nov 26 15:32:02 crc kubenswrapper[4785]: I1126 15:32:02.380096 4785 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerDied","Data":"0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3"} Nov 26 15:32:02 crc kubenswrapper[4785]: I1126 15:32:02.380123 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerStarted","Data":"7bde13b94a28351dd6efa79b3720dd393248618d1b950614b3b12f2a0a055e06"} Nov 26 15:32:02 crc kubenswrapper[4785]: I1126 15:32:02.382708 4785 generic.go:334] "Generic (PLEG): container finished" podID="850a62cf-3d25-4de4-8783-d692272ee55f" containerID="17daf3f3a3eb2c0a151f300e7066c3d675628e5d655e67e2857d35d3cd168bcd" exitCode=0 Nov 26 15:32:02 crc kubenswrapper[4785]: I1126 15:32:02.382760 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerDied","Data":"17daf3f3a3eb2c0a151f300e7066c3d675628e5d655e67e2857d35d3cd168bcd"} Nov 26 15:32:03 crc kubenswrapper[4785]: I1126 15:32:03.040164 4785 scope.go:117] "RemoveContainer" containerID="b49d0a231b02050ae980bc7bc9d683ace1a8cf024b23508af7ae9e4865580cdb" Nov 26 15:32:03 crc kubenswrapper[4785]: I1126 15:32:03.392307 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerStarted","Data":"4503f975a44882cb04bdb348b1fca023dd249dee83c6c45974d02531062d1d05"} Nov 26 15:32:03 crc kubenswrapper[4785]: I1126 15:32:03.394845 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" 
event={"ID":"bc8e3329-ae9c-48b1-a49c-92eeef6ae114","Type":"ContainerStarted","Data":"623f45591e88d84736e9990eab1c18ed1a838839947b55136461a84c9a8d0436"} Nov 26 15:32:03 crc kubenswrapper[4785]: I1126 15:32:03.395070 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:32:03 crc kubenswrapper[4785]: I1126 15:32:03.397323 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerStarted","Data":"05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273"} Nov 26 15:32:03 crc kubenswrapper[4785]: I1126 15:32:03.415007 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tvs7f" podStartSLOduration=2.7529330720000003 podStartE2EDuration="5.414990847s" podCreationTimestamp="2025-11-26 15:31:58 +0000 UTC" firstStartedPulling="2025-11-26 15:32:00.361622939 +0000 UTC m=+1484.039988703" lastFinishedPulling="2025-11-26 15:32:03.023680714 +0000 UTC m=+1486.702046478" observedRunningTime="2025-11-26 15:32:03.4129027 +0000 UTC m=+1487.091268494" watchObservedRunningTime="2025-11-26 15:32:03.414990847 +0000 UTC m=+1487.093356601" Nov 26 15:32:04 crc kubenswrapper[4785]: I1126 15:32:04.036689 4785 scope.go:117] "RemoveContainer" containerID="60d827a8d87a92fe2f68095a9b54f28b944efeca0837f67ff2d4bc193d4cb7d2" Nov 26 15:32:04 crc kubenswrapper[4785]: I1126 15:32:04.407758 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" event={"ID":"3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e","Type":"ContainerStarted","Data":"1a9087812053c098ed961d05f04a782b64acb8a8eed976b5550d977c2b08e029"} Nov 26 15:32:04 crc kubenswrapper[4785]: I1126 15:32:04.408204 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:32:04 crc kubenswrapper[4785]: I1126 15:32:04.410601 4785 generic.go:334] "Generic (PLEG): container finished" podID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerID="05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273" exitCode=0 Nov 26 15:32:04 crc kubenswrapper[4785]: I1126 15:32:04.410754 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerDied","Data":"05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273"} Nov 26 15:32:05 crc kubenswrapper[4785]: I1126 15:32:05.035979 4785 scope.go:117] "RemoveContainer" containerID="b1f9cc1d9a271ab83902646afda36e5d9aa5c62a0003a566acebc65861362b7a" Nov 26 15:32:05 crc kubenswrapper[4785]: I1126 15:32:05.422591 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerStarted","Data":"7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f"} Nov 26 15:32:05 crc kubenswrapper[4785]: I1126 15:32:05.424690 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" event={"ID":"60b24860-07b4-4841-9c4a-a5e6456a45dc","Type":"ContainerStarted","Data":"dddbe58eed2e8021dd8f83c3afc49e50dff5c0e8a6ea5ddf37bf5783b721ef9a"} Nov 26 15:32:05 crc kubenswrapper[4785]: I1126 15:32:05.425025 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:32:05 crc kubenswrapper[4785]: I1126 15:32:05.453481 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmq9d" podStartSLOduration=1.9988334779999999 podStartE2EDuration="4.453465826s" 
podCreationTimestamp="2025-11-26 15:32:01 +0000 UTC" firstStartedPulling="2025-11-26 15:32:02.383462806 +0000 UTC m=+1486.061828580" lastFinishedPulling="2025-11-26 15:32:04.838095144 +0000 UTC m=+1488.516460928" observedRunningTime="2025-11-26 15:32:05.449537779 +0000 UTC m=+1489.127903563" watchObservedRunningTime="2025-11-26 15:32:05.453465826 +0000 UTC m=+1489.131831590" Nov 26 15:32:07 crc kubenswrapper[4785]: I1126 15:32:07.042537 4785 scope.go:117] "RemoveContainer" containerID="bfdbbfd53693ffb13e3109168263dbcd1dd7251f73c2975eec5193a202b71db5" Nov 26 15:32:07 crc kubenswrapper[4785]: I1126 15:32:07.135955 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f68bdc44b-4p65x" Nov 26 15:32:07 crc kubenswrapper[4785]: I1126 15:32:07.288806 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:32:07 crc kubenswrapper[4785]: I1126 15:32:07.289205 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:32:07 crc kubenswrapper[4785]: I1126 15:32:07.445739 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" event={"ID":"62cac43a-a147-46b5-bbd6-4b452a008291","Type":"ContainerStarted","Data":"c5dc74e9a3bd7dfae391a9b69cb2b731eb960635b3e32f91bccc67d4b74d7135"} Nov 26 15:32:07 crc kubenswrapper[4785]: I1126 15:32:07.445978 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:32:08 crc kubenswrapper[4785]: I1126 15:32:08.737845 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:32:08 crc kubenswrapper[4785]: I1126 15:32:08.737902 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:32:08 crc kubenswrapper[4785]: I1126 15:32:08.779501 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:32:09 crc kubenswrapper[4785]: I1126 15:32:09.540836 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:32:10 crc kubenswrapper[4785]: I1126 15:32:10.908154 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-68b4f95d6c-cpkqd" Nov 26 15:32:10 crc kubenswrapper[4785]: I1126 15:32:10.984974 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5d784fc5bb-kn67f" Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.036133 4785 scope.go:117] "RemoveContainer" containerID="1c4d56368d21e55ca5e605cd5efcd1425e412efc9e9d1d8d4dd4cf2b910a1b69" Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.475322 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.476046 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.488525 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" event={"ID":"8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a","Type":"ContainerStarted","Data":"c79db790ad328380552b3a824cdd061285b58deae8e08567e1691cb92aeb8dd3"} Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.489657 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.538777 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:11 crc kubenswrapper[4785]: I1126 15:32:11.889030 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-647db694df-qnrxh" Nov 26 15:32:12 crc kubenswrapper[4785]: I1126 15:32:12.544889 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.572888 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jvcvt"] Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.575162 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.589382 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvcvt"] Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.730345 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-utilities\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.731002 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rvrk\" (UniqueName: \"kubernetes.io/projected/8abc4613-7050-40be-be6d-4ffc3e686b80-kube-api-access-2rvrk\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.731191 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-catalog-content\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.799453 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5t2vd"] Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.800745 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.813042 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t2vd"] Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.832808 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-utilities\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.832874 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rvrk\" (UniqueName: \"kubernetes.io/projected/8abc4613-7050-40be-be6d-4ffc3e686b80-kube-api-access-2rvrk\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.832915 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-catalog-content\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.833388 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-utilities\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.833457 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-catalog-content\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.855814 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rvrk\" (UniqueName: \"kubernetes.io/projected/8abc4613-7050-40be-be6d-4ffc3e686b80-kube-api-access-2rvrk\") pod \"certified-operators-jvcvt\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.898626 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.934659 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st5ll\" (UniqueName: \"kubernetes.io/projected/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-kube-api-access-st5ll\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.934731 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-catalog-content\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:17 crc kubenswrapper[4785]: I1126 15:32:17.934834 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-utilities\") pod \"community-operators-5t2vd\" (UID: 
\"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.036796 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st5ll\" (UniqueName: \"kubernetes.io/projected/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-kube-api-access-st5ll\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.037190 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-catalog-content\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.037913 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-catalog-content\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.038026 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-utilities\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.038285 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-utilities\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") 
" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.045547 4785 scope.go:117] "RemoveContainer" containerID="c8b85bd7f94737f2e6c8a48ba50cac91415821b1776a1bc6e3adbaf10332887e" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.058306 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st5ll\" (UniqueName: \"kubernetes.io/projected/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-kube-api-access-st5ll\") pod \"community-operators-5t2vd\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.116394 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.326817 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvcvt"] Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.551598 4785 generic.go:334] "Generic (PLEG): container finished" podID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerID="be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493" exitCode=0 Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.551891 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerDied","Data":"be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493"} Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.551921 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerStarted","Data":"7df2beb3b04966441d2d33839e4283af114a8167afebec8616997839eb66a987"} Nov 26 15:32:18 crc kubenswrapper[4785]: I1126 15:32:18.587598 4785 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t2vd"] Nov 26 15:32:18 crc kubenswrapper[4785]: W1126 15:32:18.596477 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2fc6d0_088f_46cb_b99e_9650ecfbba7f.slice/crio-84862910b35164c44c6e8f4f6c7011dc460d02e77fb695b0c6495d9a8805a269 WatchSource:0}: Error finding container 84862910b35164c44c6e8f4f6c7011dc460d02e77fb695b0c6495d9a8805a269: Status 404 returned error can't find the container with id 84862910b35164c44c6e8f4f6c7011dc460d02e77fb695b0c6495d9a8805a269 Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.560803 4785 generic.go:334] "Generic (PLEG): container finished" podID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerID="cb08ad323fd0d86944c6b644cd09dd47ae8b5973cb650f6d970dc79ad0627e14" exitCode=0 Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.560887 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerDied","Data":"cb08ad323fd0d86944c6b644cd09dd47ae8b5973cb650f6d970dc79ad0627e14"} Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.561222 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerStarted","Data":"84862910b35164c44c6e8f4f6c7011dc460d02e77fb695b0c6495d9a8805a269"} Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.563339 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerStarted","Data":"339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5"} Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.971143 4785 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-xx9nq"] Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.972694 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:19 crc kubenswrapper[4785]: I1126 15:32:19.980319 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xx9nq"] Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.068100 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-catalog-content\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.068160 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2wf\" (UniqueName: \"kubernetes.io/projected/bb46eb70-9020-45d0-8679-a3ea2e75df85-kube-api-access-qh2wf\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.068179 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-utilities\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.169426 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-catalog-content\") pod \"redhat-marketplace-xx9nq\" (UID: 
\"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.169529 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2wf\" (UniqueName: \"kubernetes.io/projected/bb46eb70-9020-45d0-8679-a3ea2e75df85-kube-api-access-qh2wf\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.169578 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-utilities\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.169984 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-catalog-content\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.170137 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-utilities\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.189612 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2wf\" (UniqueName: \"kubernetes.io/projected/bb46eb70-9020-45d0-8679-a3ea2e75df85-kube-api-access-qh2wf\") pod \"redhat-marketplace-xx9nq\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " 
pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.334709 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.378289 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9d7v"] Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.379926 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.389895 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9d7v"] Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.476488 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcpt\" (UniqueName: \"kubernetes.io/projected/827906ee-a764-4950-849d-788b5f263040-kube-api-access-2vcpt\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.477028 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-utilities\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.477061 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-catalog-content\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " 
pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.574308 4785 generic.go:334] "Generic (PLEG): container finished" podID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerID="339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5" exitCode=0 Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.574425 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerDied","Data":"339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5"} Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.587298 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerStarted","Data":"b593c6fb90d9b6de665762269dfc77d1407d10cc6488d05d2ff5c64efde7f361"} Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.604749 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-catalog-content\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.604970 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcpt\" (UniqueName: \"kubernetes.io/projected/827906ee-a764-4950-849d-788b5f263040-kube-api-access-2vcpt\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.605046 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-utilities\") pod 
\"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.605656 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-utilities\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.605661 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-catalog-content\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.634749 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcpt\" (UniqueName: \"kubernetes.io/projected/827906ee-a764-4950-849d-788b5f263040-kube-api-access-2vcpt\") pod \"redhat-operators-n9d7v\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.760544 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:20 crc kubenswrapper[4785]: I1126 15:32:20.802247 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xx9nq"] Nov 26 15:32:20 crc kubenswrapper[4785]: W1126 15:32:20.822923 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb46eb70_9020_45d0_8679_a3ea2e75df85.slice/crio-9da744eaae5d3b1a2d6022353b56278aa8926fe043260e964dfc381ef8401b90 WatchSource:0}: Error finding container 9da744eaae5d3b1a2d6022353b56278aa8926fe043260e964dfc381ef8401b90: Status 404 returned error can't find the container with id 9da744eaae5d3b1a2d6022353b56278aa8926fe043260e964dfc381ef8401b90 Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.009752 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9d7v"] Nov 26 15:32:21 crc kubenswrapper[4785]: W1126 15:32:21.025375 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod827906ee_a764_4950_849d_788b5f263040.slice/crio-04db5effbc50aedfce6e419651a56228eb82ef23039673861e7e8bca262bbd48 WatchSource:0}: Error finding container 04db5effbc50aedfce6e419651a56228eb82ef23039673861e7e8bca262bbd48: Status 404 returned error can't find the container with id 04db5effbc50aedfce6e419651a56228eb82ef23039673861e7e8bca262bbd48 Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.148995 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-56ccd5f88c-dzft5" Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.595954 4785 generic.go:334] "Generic (PLEG): container finished" podID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerID="c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c" exitCode=0 Nov 26 15:32:21 crc 
kubenswrapper[4785]: I1126 15:32:21.596011 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xx9nq" event={"ID":"bb46eb70-9020-45d0-8679-a3ea2e75df85","Type":"ContainerDied","Data":"c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c"} Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.596372 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xx9nq" event={"ID":"bb46eb70-9020-45d0-8679-a3ea2e75df85","Type":"ContainerStarted","Data":"9da744eaae5d3b1a2d6022353b56278aa8926fe043260e964dfc381ef8401b90"} Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.598442 4785 generic.go:334] "Generic (PLEG): container finished" podID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerID="b593c6fb90d9b6de665762269dfc77d1407d10cc6488d05d2ff5c64efde7f361" exitCode=0 Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.598757 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerDied","Data":"b593c6fb90d9b6de665762269dfc77d1407d10cc6488d05d2ff5c64efde7f361"} Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.601418 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerStarted","Data":"4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144"} Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.603343 4785 generic.go:334] "Generic (PLEG): container finished" podID="827906ee-a764-4950-849d-788b5f263040" containerID="ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73" exitCode=0 Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.603395 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9d7v" 
event={"ID":"827906ee-a764-4950-849d-788b5f263040","Type":"ContainerDied","Data":"ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73"} Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.603422 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9d7v" event={"ID":"827906ee-a764-4950-849d-788b5f263040","Type":"ContainerStarted","Data":"04db5effbc50aedfce6e419651a56228eb82ef23039673861e7e8bca262bbd48"} Nov 26 15:32:21 crc kubenswrapper[4785]: I1126 15:32:21.640071 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jvcvt" podStartSLOduration=1.8588489259999998 podStartE2EDuration="4.640052074s" podCreationTimestamp="2025-11-26 15:32:17 +0000 UTC" firstStartedPulling="2025-11-26 15:32:18.553587356 +0000 UTC m=+1502.231953140" lastFinishedPulling="2025-11-26 15:32:21.334790514 +0000 UTC m=+1505.013156288" observedRunningTime="2025-11-26 15:32:21.637956127 +0000 UTC m=+1505.316321901" watchObservedRunningTime="2025-11-26 15:32:21.640052074 +0000 UTC m=+1505.318417838" Nov 26 15:32:22 crc kubenswrapper[4785]: I1126 15:32:22.612373 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerStarted","Data":"cd1463dad96250b093df44e7c6ab9f6673afce5b0de80995e3b6a8c16db50c2e"} Nov 26 15:32:22 crc kubenswrapper[4785]: I1126 15:32:22.639072 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5t2vd" podStartSLOduration=3.084692434 podStartE2EDuration="5.639055657s" podCreationTimestamp="2025-11-26 15:32:17 +0000 UTC" firstStartedPulling="2025-11-26 15:32:19.562564621 +0000 UTC m=+1503.240930385" lastFinishedPulling="2025-11-26 15:32:22.116927844 +0000 UTC m=+1505.795293608" observedRunningTime="2025-11-26 15:32:22.635887071 +0000 UTC m=+1506.314252855" 
watchObservedRunningTime="2025-11-26 15:32:22.639055657 +0000 UTC m=+1506.317421421" Nov 26 15:32:23 crc kubenswrapper[4785]: I1126 15:32:23.621443 4785 generic.go:334] "Generic (PLEG): container finished" podID="827906ee-a764-4950-849d-788b5f263040" containerID="ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e" exitCode=0 Nov 26 15:32:23 crc kubenswrapper[4785]: I1126 15:32:23.621537 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9d7v" event={"ID":"827906ee-a764-4950-849d-788b5f263040","Type":"ContainerDied","Data":"ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e"} Nov 26 15:32:23 crc kubenswrapper[4785]: I1126 15:32:23.624427 4785 generic.go:334] "Generic (PLEG): container finished" podID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerID="2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721" exitCode=0 Nov 26 15:32:23 crc kubenswrapper[4785]: I1126 15:32:23.626092 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xx9nq" event={"ID":"bb46eb70-9020-45d0-8679-a3ea2e75df85","Type":"ContainerDied","Data":"2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721"} Nov 26 15:32:24 crc kubenswrapper[4785]: I1126 15:32:24.389084 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-747fb5cb85-5slw2" Nov 26 15:32:24 crc kubenswrapper[4785]: I1126 15:32:24.632826 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xx9nq" event={"ID":"bb46eb70-9020-45d0-8679-a3ea2e75df85","Type":"ContainerStarted","Data":"5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2"} Nov 26 15:32:24 crc kubenswrapper[4785]: I1126 15:32:24.636607 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9d7v" 
event={"ID":"827906ee-a764-4950-849d-788b5f263040","Type":"ContainerStarted","Data":"4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0"} Nov 26 15:32:24 crc kubenswrapper[4785]: I1126 15:32:24.658378 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xx9nq" podStartSLOduration=2.942874387 podStartE2EDuration="5.658359565s" podCreationTimestamp="2025-11-26 15:32:19 +0000 UTC" firstStartedPulling="2025-11-26 15:32:21.597625219 +0000 UTC m=+1505.275990993" lastFinishedPulling="2025-11-26 15:32:24.313110387 +0000 UTC m=+1507.991476171" observedRunningTime="2025-11-26 15:32:24.652593938 +0000 UTC m=+1508.330959712" watchObservedRunningTime="2025-11-26 15:32:24.658359565 +0000 UTC m=+1508.336725329" Nov 26 15:32:24 crc kubenswrapper[4785]: I1126 15:32:24.668160 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9d7v" podStartSLOduration=1.9871980329999999 podStartE2EDuration="4.668141511s" podCreationTimestamp="2025-11-26 15:32:20 +0000 UTC" firstStartedPulling="2025-11-26 15:32:21.605287327 +0000 UTC m=+1505.283653091" lastFinishedPulling="2025-11-26 15:32:24.286230805 +0000 UTC m=+1507.964596569" observedRunningTime="2025-11-26 15:32:24.665861019 +0000 UTC m=+1508.344226803" watchObservedRunningTime="2025-11-26 15:32:24.668141511 +0000 UTC m=+1508.346507275" Nov 26 15:32:27 crc kubenswrapper[4785]: I1126 15:32:27.898978 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:27 crc kubenswrapper[4785]: I1126 15:32:27.899297 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:27 crc kubenswrapper[4785]: I1126 15:32:27.943831 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jvcvt" Nov 
26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.116884 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.116935 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.160929 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.585241 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h295v"] Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.587310 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.595856 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h295v"] Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.711526 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.714738 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.756239 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gsk\" (UniqueName: \"kubernetes.io/projected/e16ee24a-ec88-4d48-8a64-1136e673aa8d-kube-api-access-x6gsk\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.756278 4785 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-catalog-content\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.756315 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-utilities\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.858061 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gsk\" (UniqueName: \"kubernetes.io/projected/e16ee24a-ec88-4d48-8a64-1136e673aa8d-kube-api-access-x6gsk\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.858100 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-catalog-content\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.858135 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-utilities\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 
15:32:28.858534 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-utilities\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.858760 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-catalog-content\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.876436 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gsk\" (UniqueName: \"kubernetes.io/projected/e16ee24a-ec88-4d48-8a64-1136e673aa8d-kube-api-access-x6gsk\") pod \"certified-operators-h295v\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:28 crc kubenswrapper[4785]: I1126 15:32:28.931350 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.216065 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84667dbb5-sslgl" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.375010 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rcm4r"] Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.376858 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.391378 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcm4r"] Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.451587 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h295v"] Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.467595 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-utilities\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.467640 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t2tt\" (UniqueName: \"kubernetes.io/projected/937be872-928d-4374-8605-eb4e529dbebe-kube-api-access-7t2tt\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.467683 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-catalog-content\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.568941 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t2tt\" (UniqueName: \"kubernetes.io/projected/937be872-928d-4374-8605-eb4e529dbebe-kube-api-access-7t2tt\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") 
" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.569188 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-utilities\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.569292 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-catalog-content\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.569868 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-utilities\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.570003 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-catalog-content\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.592464 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t2tt\" (UniqueName: \"kubernetes.io/projected/937be872-928d-4374-8605-eb4e529dbebe-kube-api-access-7t2tt\") pod \"redhat-operators-rcm4r\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc 
kubenswrapper[4785]: I1126 15:32:29.696014 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:29 crc kubenswrapper[4785]: I1126 15:32:29.709726 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerStarted","Data":"c5b3bd83719eec7acbb5af7518744db1e6504c045f5e196b0d9a75692f817d75"} Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.269728 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcm4r"] Nov 26 15:32:30 crc kubenswrapper[4785]: W1126 15:32:30.271735 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937be872_928d_4374_8605_eb4e529dbebe.slice/crio-ae51c466262f853014d46a59723d8bca6442e60974dce6dea0996367b5234285 WatchSource:0}: Error finding container ae51c466262f853014d46a59723d8bca6442e60974dce6dea0996367b5234285: Status 404 returned error can't find the container with id ae51c466262f853014d46a59723d8bca6442e60974dce6dea0996367b5234285 Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.335759 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.335805 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.411285 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.719674 4785 generic.go:334] "Generic (PLEG): container finished" podID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" 
containerID="85198a4a0b020c0771901c8d52ecdc9a47a1b24b484dbd636cadcf60cb843423" exitCode=0 Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.719741 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerDied","Data":"85198a4a0b020c0771901c8d52ecdc9a47a1b24b484dbd636cadcf60cb843423"} Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.722372 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerStarted","Data":"e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d"} Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.722443 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerStarted","Data":"ae51c466262f853014d46a59723d8bca6442e60974dce6dea0996367b5234285"} Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.762439 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.762475 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.804296 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:30 crc kubenswrapper[4785]: I1126 15:32:30.816625 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:32:31 crc kubenswrapper[4785]: I1126 15:32:31.786434 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 
15:32:32 crc kubenswrapper[4785]: I1126 15:32:32.745123 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerStarted","Data":"e3f5c65ba95b26d1475bb9e2dd916bd6c2a42d6816987e2943c8b1bf84c9985a"} Nov 26 15:32:32 crc kubenswrapper[4785]: I1126 15:32:32.747869 4785 generic.go:334] "Generic (PLEG): container finished" podID="937be872-928d-4374-8605-eb4e529dbebe" containerID="e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d" exitCode=0 Nov 26 15:32:32 crc kubenswrapper[4785]: I1126 15:32:32.748298 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerDied","Data":"e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d"} Nov 26 15:32:33 crc kubenswrapper[4785]: I1126 15:32:33.760033 4785 generic.go:334] "Generic (PLEG): container finished" podID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerID="e3f5c65ba95b26d1475bb9e2dd916bd6c2a42d6816987e2943c8b1bf84c9985a" exitCode=0 Nov 26 15:32:33 crc kubenswrapper[4785]: I1126 15:32:33.760255 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerDied","Data":"e3f5c65ba95b26d1475bb9e2dd916bd6c2a42d6816987e2943c8b1bf84c9985a"} Nov 26 15:32:33 crc kubenswrapper[4785]: I1126 15:32:33.765532 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerStarted","Data":"0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace"} Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.174125 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5lfl5"] Nov 26 15:32:34 crc kubenswrapper[4785]: 
I1126 15:32:34.175763 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.194528 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5lfl5"] Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.370736 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/42779d5c-2b5c-4058-9b9d-995285d02e80-kube-api-access-cngvm\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.371191 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-catalog-content\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.371341 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-utilities\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.472797 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-utilities\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: 
I1126 15:32:34.472903 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/42779d5c-2b5c-4058-9b9d-995285d02e80-kube-api-access-cngvm\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.472946 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-catalog-content\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.473604 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-catalog-content\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.473817 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-utilities\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.492008 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/42779d5c-2b5c-4058-9b9d-995285d02e80-kube-api-access-cngvm\") pod \"certified-operators-5lfl5\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.774829 4785 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerStarted","Data":"90c5acb30ad680d4d4e83c48bedd33d933eabe341995fc406429a70569aeead2"} Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.778721 4785 generic.go:334] "Generic (PLEG): container finished" podID="937be872-928d-4374-8605-eb4e529dbebe" containerID="0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace" exitCode=0 Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.778775 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerDied","Data":"0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace"} Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.782127 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cnlfc"] Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.784001 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.791037 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.799062 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cnlfc"] Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.803454 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h295v" podStartSLOduration=3.244112216 podStartE2EDuration="6.803433895s" podCreationTimestamp="2025-11-26 15:32:28 +0000 UTC" firstStartedPulling="2025-11-26 15:32:30.722278091 +0000 UTC m=+1514.400643905" lastFinishedPulling="2025-11-26 15:32:34.28159983 +0000 UTC m=+1517.959965584" observedRunningTime="2025-11-26 15:32:34.800087024 +0000 UTC m=+1518.478452808" watchObservedRunningTime="2025-11-26 15:32:34.803433895 +0000 UTC m=+1518.481799659" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.879850 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-catalog-content\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.880151 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-utilities\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.880287 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzct6\" (UniqueName: \"kubernetes.io/projected/df43a35c-ce4b-434a-90a2-69bd1e97090d-kube-api-access-tzct6\") pod 
\"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.981461 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzct6\" (UniqueName: \"kubernetes.io/projected/df43a35c-ce4b-434a-90a2-69bd1e97090d-kube-api-access-tzct6\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.982328 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-catalog-content\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.982738 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-catalog-content\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.982803 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-utilities\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.983116 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-utilities\") pod \"redhat-operators-cnlfc\" (UID: 
\"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:34 crc kubenswrapper[4785]: I1126 15:32:34.999292 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzct6\" (UniqueName: \"kubernetes.io/projected/df43a35c-ce4b-434a-90a2-69bd1e97090d-kube-api-access-tzct6\") pod \"redhat-operators-cnlfc\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.088074 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5lfl5"] Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.108078 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.375172 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cnlfc"] Nov 26 15:32:35 crc kubenswrapper[4785]: W1126 15:32:35.379931 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf43a35c_ce4b_434a_90a2_69bd1e97090d.slice/crio-06df63a948f195415efab95ffc743a16fe3604afb5cc6a3abe0e7ef33502ffb7 WatchSource:0}: Error finding container 06df63a948f195415efab95ffc743a16fe3604afb5cc6a3abe0e7ef33502ffb7: Status 404 returned error can't find the container with id 06df63a948f195415efab95ffc743a16fe3604afb5cc6a3abe0e7ef33502ffb7 Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.788974 4785 generic.go:334] "Generic (PLEG): container finished" podID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerID="69cb355ef838ac455688d18fb38bac207484ba1426f6347c983a0f651ba74df7" exitCode=0 Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.789014 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" 
event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerDied","Data":"69cb355ef838ac455688d18fb38bac207484ba1426f6347c983a0f651ba74df7"} Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.789334 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerStarted","Data":"3e9816f4cf71b2c70f4a2a21729d759b0d2f7cc5d52351899cacc05f8d5ea1d8"} Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.791228 4785 generic.go:334] "Generic (PLEG): container finished" podID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerID="d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6" exitCode=0 Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.791856 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerDied","Data":"d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6"} Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.791898 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerStarted","Data":"06df63a948f195415efab95ffc743a16fe3604afb5cc6a3abe0e7ef33502ffb7"} Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.802711 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerStarted","Data":"740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847"} Nov 26 15:32:35 crc kubenswrapper[4785]: I1126 15:32:35.870523 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rcm4r" podStartSLOduration=4.324740994 podStartE2EDuration="6.870503942s" podCreationTimestamp="2025-11-26 15:32:29 +0000 UTC" 
firstStartedPulling="2025-11-26 15:32:32.750853772 +0000 UTC m=+1516.429219556" lastFinishedPulling="2025-11-26 15:32:35.29661674 +0000 UTC m=+1518.974982504" observedRunningTime="2025-11-26 15:32:35.845711767 +0000 UTC m=+1519.524077541" watchObservedRunningTime="2025-11-26 15:32:35.870503942 +0000 UTC m=+1519.548869706" Nov 26 15:32:36 crc kubenswrapper[4785]: I1126 15:32:36.812460 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerStarted","Data":"876da037fa54d4010a81d565d7c214117d0bf6b0d4ab39ad90ec9741a5dfbc42"} Nov 26 15:32:36 crc kubenswrapper[4785]: I1126 15:32:36.814797 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerStarted","Data":"77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2"} Nov 26 15:32:37 crc kubenswrapper[4785]: I1126 15:32:37.289077 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:32:37 crc kubenswrapper[4785]: I1126 15:32:37.289139 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:32:37 crc kubenswrapper[4785]: I1126 15:32:37.289182 4785 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" Nov 26 15:32:37 crc kubenswrapper[4785]: I1126 15:32:37.289909 4785 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a"} pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 26 15:32:37 crc kubenswrapper[4785]: I1126 15:32:37.289975 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" containerID="cri-o://3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" gracePeriod=600 Nov 26 15:32:38 crc kubenswrapper[4785]: I1126 15:32:38.932091 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:38 crc kubenswrapper[4785]: I1126 15:32:38.932404 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:38 crc kubenswrapper[4785]: I1126 15:32:38.995635 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.584849 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dx6bv"] Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.586678 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.594001 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dx6bv"] Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.696333 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.697652 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.758356 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-utilities\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.758474 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdktf\" (UniqueName: \"kubernetes.io/projected/439a16e9-65e4-4450-b95a-79095a4eebb9-kube-api-access-gdktf\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.758496 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-catalog-content\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.859913 4785 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-utilities\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.860079 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdktf\" (UniqueName: \"kubernetes.io/projected/439a16e9-65e4-4450-b95a-79095a4eebb9-kube-api-access-gdktf\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.860107 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-catalog-content\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.860417 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-utilities\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.860577 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-catalog-content\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.879861 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdktf\" 
(UniqueName: \"kubernetes.io/projected/439a16e9-65e4-4450-b95a-79095a4eebb9-kube-api-access-gdktf\") pod \"community-operators-dx6bv\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.896735 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:32:39 crc kubenswrapper[4785]: I1126 15:32:39.910863 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.354338 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dx6bv"] Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.388102 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frnfh"] Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.389480 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.401856 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frnfh"] Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.481379 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7crcd\" (UniqueName: \"kubernetes.io/projected/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-kube-api-access-7crcd\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.481466 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-utilities\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.481488 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-catalog-content\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.582995 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-catalog-content\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.583462 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7crcd\" (UniqueName: \"kubernetes.io/projected/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-kube-api-access-7crcd\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.583658 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-utilities\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.584084 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-catalog-content\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.584284 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-utilities\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.602045 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7crcd\" (UniqueName: \"kubernetes.io/projected/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-kube-api-access-7crcd\") pod \"certified-operators-frnfh\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.747088 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.747882 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rcm4r" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="registry-server" probeResult="failure" output=< Nov 26 15:32:40 crc kubenswrapper[4785]: timeout: failed to connect service ":50051" within 1s Nov 26 15:32:40 crc kubenswrapper[4785]: > Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.847826 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerStarted","Data":"0e94d951ea561cb61ba0d4257a748f2843787f8d1b5a9bd086e67510eb32a3d7"} Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.849680 4785 generic.go:334] "Generic (PLEG): container finished" podID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerID="876da037fa54d4010a81d565d7c214117d0bf6b0d4ab39ad90ec9741a5dfbc42" exitCode=0 Nov 26 15:32:40 crc kubenswrapper[4785]: I1126 15:32:40.849769 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerDied","Data":"876da037fa54d4010a81d565d7c214117d0bf6b0d4ab39ad90ec9741a5dfbc42"} Nov 26 15:32:41 crc kubenswrapper[4785]: E1126 15:32:41.154487 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:32:41 crc kubenswrapper[4785]: I1126 15:32:41.230384 4785 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-frnfh"] Nov 26 15:32:41 crc kubenswrapper[4785]: I1126 15:32:41.859457 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerStarted","Data":"9281f7c8493951a7c2d93b02236c4b1cf01bbdc241c7f47032d73d4203b99636"} Nov 26 15:32:41 crc kubenswrapper[4785]: I1126 15:32:41.863047 4785 generic.go:334] "Generic (PLEG): container finished" podID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerID="77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2" exitCode=0 Nov 26 15:32:41 crc kubenswrapper[4785]: I1126 15:32:41.863086 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerDied","Data":"77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2"} Nov 26 15:32:41 crc kubenswrapper[4785]: I1126 15:32:41.865647 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerStarted","Data":"6f40a2db002e05e9f10bc8ff950b20eac68d64a61c9fba802fb86e21b04752ba"} Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.570664 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xx9nq"] Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.571415 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xx9nq" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="registry-server" containerID="cri-o://5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2" gracePeriod=2 Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.611399 4785 generic.go:334] "Generic (PLEG): container finished" podID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" 
containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" exitCode=0 Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.611477 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerDied","Data":"3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a"} Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.611549 4785 scope.go:117] "RemoveContainer" containerID="516aa1c954a9db24fdce21dd42520586d80f2cd7166b0bbaba72d20a1a2dfd33" Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.612259 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:32:45 crc kubenswrapper[4785]: E1126 15:32:45.612619 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:32:45 crc kubenswrapper[4785]: I1126 15:32:45.981494 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.178627 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-utilities\") pod \"bb46eb70-9020-45d0-8679-a3ea2e75df85\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.178689 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-catalog-content\") pod \"bb46eb70-9020-45d0-8679-a3ea2e75df85\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.178712 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh2wf\" (UniqueName: \"kubernetes.io/projected/bb46eb70-9020-45d0-8679-a3ea2e75df85-kube-api-access-qh2wf\") pod \"bb46eb70-9020-45d0-8679-a3ea2e75df85\" (UID: \"bb46eb70-9020-45d0-8679-a3ea2e75df85\") " Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.179434 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-utilities" (OuterVolumeSpecName: "utilities") pod "bb46eb70-9020-45d0-8679-a3ea2e75df85" (UID: "bb46eb70-9020-45d0-8679-a3ea2e75df85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.185821 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb46eb70-9020-45d0-8679-a3ea2e75df85-kube-api-access-qh2wf" (OuterVolumeSpecName: "kube-api-access-qh2wf") pod "bb46eb70-9020-45d0-8679-a3ea2e75df85" (UID: "bb46eb70-9020-45d0-8679-a3ea2e75df85"). InnerVolumeSpecName "kube-api-access-qh2wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.193545 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb46eb70-9020-45d0-8679-a3ea2e75df85" (UID: "bb46eb70-9020-45d0-8679-a3ea2e75df85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.281229 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.281272 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb46eb70-9020-45d0-8679-a3ea2e75df85-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.281289 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh2wf\" (UniqueName: \"kubernetes.io/projected/bb46eb70-9020-45d0-8679-a3ea2e75df85-kube-api-access-qh2wf\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.620144 4785 generic.go:334] "Generic (PLEG): container finished" podID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerID="6f40a2db002e05e9f10bc8ff950b20eac68d64a61c9fba802fb86e21b04752ba" exitCode=0 Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.620196 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerDied","Data":"6f40a2db002e05e9f10bc8ff950b20eac68d64a61c9fba802fb86e21b04752ba"} Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.623809 4785 generic.go:334] "Generic (PLEG): container 
finished" podID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerID="b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a" exitCode=0 Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.623856 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerDied","Data":"b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a"} Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.627002 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerStarted","Data":"825f8ac8fd011688995c2e27ad9ddbeffc233278778c0af9bcc8be3fd6113da0"} Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.630138 4785 generic.go:334] "Generic (PLEG): container finished" podID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerID="5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2" exitCode=0 Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.630191 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xx9nq" event={"ID":"bb46eb70-9020-45d0-8679-a3ea2e75df85","Type":"ContainerDied","Data":"5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2"} Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.630247 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xx9nq" event={"ID":"bb46eb70-9020-45d0-8679-a3ea2e75df85","Type":"ContainerDied","Data":"9da744eaae5d3b1a2d6022353b56278aa8926fe043260e964dfc381ef8401b90"} Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.630205 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xx9nq" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.630276 4785 scope.go:117] "RemoveContainer" containerID="5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.633162 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerStarted","Data":"6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656"} Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.656457 4785 scope.go:117] "RemoveContainer" containerID="2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.678760 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5lfl5" podStartSLOduration=3.267230583 podStartE2EDuration="12.678670974s" podCreationTimestamp="2025-11-26 15:32:34 +0000 UTC" firstStartedPulling="2025-11-26 15:32:35.791335097 +0000 UTC m=+1519.469700871" lastFinishedPulling="2025-11-26 15:32:45.202775498 +0000 UTC m=+1528.881141262" observedRunningTime="2025-11-26 15:32:46.669894565 +0000 UTC m=+1530.348260339" watchObservedRunningTime="2025-11-26 15:32:46.678670974 +0000 UTC m=+1530.357036758" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.690464 4785 scope.go:117] "RemoveContainer" containerID="c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.729365 4785 scope.go:117] "RemoveContainer" containerID="5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2" Nov 26 15:32:46 crc kubenswrapper[4785]: E1126 15:32:46.730712 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2\": container with ID starting with 5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2 not found: ID does not exist" containerID="5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.731327 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2"} err="failed to get container status \"5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2\": rpc error: code = NotFound desc = could not find container \"5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2\": container with ID starting with 5a4d0c0ccd7d0e70bcf82c596b756f23aa9f5068e0e7a249a13848cffece44f2 not found: ID does not exist" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.731361 4785 scope.go:117] "RemoveContainer" containerID="2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721" Nov 26 15:32:46 crc kubenswrapper[4785]: E1126 15:32:46.732273 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721\": container with ID starting with 2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721 not found: ID does not exist" containerID="2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.733678 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721"} err="failed to get container status \"2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721\": rpc error: code = NotFound desc = could not find container \"2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721\": container with ID 
starting with 2513385f72e1929426bd1ac51124ef014d23bdd9a9c847d93cfa816ad4365721 not found: ID does not exist" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.733705 4785 scope.go:117] "RemoveContainer" containerID="c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c" Nov 26 15:32:46 crc kubenswrapper[4785]: E1126 15:32:46.734351 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c\": container with ID starting with c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c not found: ID does not exist" containerID="c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.734379 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c"} err="failed to get container status \"c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c\": rpc error: code = NotFound desc = could not find container \"c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c\": container with ID starting with c4894aba2092228c9c5f84654c45f4f36c056c2c042553442eff9b4b8678152c not found: ID does not exist" Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.735959 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cnlfc" podStartSLOduration=3.129323009 podStartE2EDuration="12.735945793s" podCreationTimestamp="2025-11-26 15:32:34 +0000 UTC" firstStartedPulling="2025-11-26 15:32:35.793342972 +0000 UTC m=+1519.471708736" lastFinishedPulling="2025-11-26 15:32:45.399965716 +0000 UTC m=+1529.078331520" observedRunningTime="2025-11-26 15:32:46.721223312 +0000 UTC m=+1530.399589076" watchObservedRunningTime="2025-11-26 15:32:46.735945793 +0000 UTC m=+1530.414311577" Nov 26 15:32:46 crc 
kubenswrapper[4785]: I1126 15:32:46.741294 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xx9nq"] Nov 26 15:32:46 crc kubenswrapper[4785]: I1126 15:32:46.747797 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xx9nq"] Nov 26 15:32:47 crc kubenswrapper[4785]: I1126 15:32:47.062212 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" path="/var/lib/kubelet/pods/bb46eb70-9020-45d0-8679-a3ea2e75df85/volumes" Nov 26 15:32:47 crc kubenswrapper[4785]: I1126 15:32:47.657039 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerStarted","Data":"db0b9c6c5995e44ceafc79a1d2c6dc67ba5ec7dd91f1a2bde6b657c37abe34fc"} Nov 26 15:32:47 crc kubenswrapper[4785]: I1126 15:32:47.664073 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerStarted","Data":"cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f"} Nov 26 15:32:48 crc kubenswrapper[4785]: I1126 15:32:48.672911 4785 generic.go:334] "Generic (PLEG): container finished" podID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerID="db0b9c6c5995e44ceafc79a1d2c6dc67ba5ec7dd91f1a2bde6b657c37abe34fc" exitCode=0 Nov 26 15:32:48 crc kubenswrapper[4785]: I1126 15:32:48.673028 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerDied","Data":"db0b9c6c5995e44ceafc79a1d2c6dc67ba5ec7dd91f1a2bde6b657c37abe34fc"} Nov 26 15:32:48 crc kubenswrapper[4785]: I1126 15:32:48.677208 4785 generic.go:334] "Generic (PLEG): container finished" podID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" 
containerID="cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f" exitCode=0 Nov 26 15:32:48 crc kubenswrapper[4785]: I1126 15:32:48.677249 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerDied","Data":"cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f"} Nov 26 15:32:49 crc kubenswrapper[4785]: I1126 15:32:49.060322 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-8m6gk"] Nov 26 15:32:49 crc kubenswrapper[4785]: I1126 15:32:49.068147 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2"] Nov 26 15:32:49 crc kubenswrapper[4785]: I1126 15:32:49.074403 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-8m6gk"] Nov 26 15:32:49 crc kubenswrapper[4785]: I1126 15:32:49.079675 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-84ae-account-create-update-c7dg2"] Nov 26 15:32:49 crc kubenswrapper[4785]: I1126 15:32:49.775410 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:49 crc kubenswrapper[4785]: I1126 15:32:49.832309 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:50 crc kubenswrapper[4785]: I1126 15:32:50.699744 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerStarted","Data":"25e29c4311627efe08667078492a435d58d0181767c202437d86567030f68d9f"} Nov 26 15:32:50 crc kubenswrapper[4785]: I1126 15:32:50.703732 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" 
event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerStarted","Data":"c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd"} Nov 26 15:32:50 crc kubenswrapper[4785]: I1126 15:32:50.717516 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dx6bv" podStartSLOduration=8.651183705 podStartE2EDuration="11.717500764s" podCreationTimestamp="2025-11-26 15:32:39 +0000 UTC" firstStartedPulling="2025-11-26 15:32:46.622633708 +0000 UTC m=+1530.300999472" lastFinishedPulling="2025-11-26 15:32:49.688950757 +0000 UTC m=+1533.367316531" observedRunningTime="2025-11-26 15:32:50.714099372 +0000 UTC m=+1534.392465146" watchObservedRunningTime="2025-11-26 15:32:50.717500764 +0000 UTC m=+1534.395866518" Nov 26 15:32:50 crc kubenswrapper[4785]: I1126 15:32:50.748616 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:50 crc kubenswrapper[4785]: I1126 15:32:50.748688 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:32:51 crc kubenswrapper[4785]: I1126 15:32:51.045986 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720626e8-8cf9-499b-970c-95fce7ef463a" path="/var/lib/kubelet/pods/720626e8-8cf9-499b-970c-95fce7ef463a/volumes" Nov 26 15:32:51 crc kubenswrapper[4785]: I1126 15:32:51.046854 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74b3e785-2714-48e3-a605-95a8056b8220" path="/var/lib/kubelet/pods/74b3e785-2714-48e3-a605-95a8056b8220/volumes" Nov 26 15:32:51 crc kubenswrapper[4785]: I1126 15:32:51.793078 4785 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-frnfh" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="registry-server" probeResult="failure" output=< Nov 26 15:32:51 crc kubenswrapper[4785]: timeout: 
failed to connect service ":50051" within 1s Nov 26 15:32:51 crc kubenswrapper[4785]: > Nov 26 15:32:54 crc kubenswrapper[4785]: I1126 15:32:54.791932 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:54 crc kubenswrapper[4785]: I1126 15:32:54.793657 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:54 crc kubenswrapper[4785]: I1126 15:32:54.852930 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:54 crc kubenswrapper[4785]: I1126 15:32:54.873004 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frnfh" podStartSLOduration=11.859569664 podStartE2EDuration="14.872986452s" podCreationTimestamp="2025-11-26 15:32:40 +0000 UTC" firstStartedPulling="2025-11-26 15:32:46.625104156 +0000 UTC m=+1530.303469920" lastFinishedPulling="2025-11-26 15:32:49.638520934 +0000 UTC m=+1533.316886708" observedRunningTime="2025-11-26 15:32:50.72799154 +0000 UTC m=+1534.406357294" watchObservedRunningTime="2025-11-26 15:32:54.872986452 +0000 UTC m=+1538.551352216" Nov 26 15:32:55 crc kubenswrapper[4785]: I1126 15:32:55.108891 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:55 crc kubenswrapper[4785]: I1126 15:32:55.108936 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:55 crc kubenswrapper[4785]: I1126 15:32:55.149185 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:55 crc kubenswrapper[4785]: I1126 15:32:55.817919 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:32:55 crc kubenswrapper[4785]: I1126 15:32:55.827583 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:32:57 crc kubenswrapper[4785]: I1126 15:32:57.045320 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:32:57 crc kubenswrapper[4785]: E1126 15:32:57.046243 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.166099 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcm4r"] Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.166783 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rcm4r" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="registry-server" containerID="cri-o://740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847" gracePeriod=2 Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.378393 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9g4wk"] Nov 26 15:32:59 crc kubenswrapper[4785]: E1126 15:32:59.378707 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="extract-utilities" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.378718 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" 
containerName="extract-utilities" Nov 26 15:32:59 crc kubenswrapper[4785]: E1126 15:32:59.378732 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="extract-content" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.378739 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="extract-content" Nov 26 15:32:59 crc kubenswrapper[4785]: E1126 15:32:59.378771 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="registry-server" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.378777 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="registry-server" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.378967 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb46eb70-9020-45d0-8679-a3ea2e75df85" containerName="registry-server" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.379990 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.400179 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g4wk"] Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.495754 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-utilities\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.496120 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjv9\" (UniqueName: \"kubernetes.io/projected/738357c6-a8dc-4498-ba43-9124dc07b379-kube-api-access-lkjv9\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.496173 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-catalog-content\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.599058 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-utilities\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.599144 4785 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lkjv9\" (UniqueName: \"kubernetes.io/projected/738357c6-a8dc-4498-ba43-9124dc07b379-kube-api-access-lkjv9\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.599195 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-catalog-content\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.599814 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-catalog-content\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.600084 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-utilities\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.624288 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjv9\" (UniqueName: \"kubernetes.io/projected/738357c6-a8dc-4498-ba43-9124dc07b379-kube-api-access-lkjv9\") pod \"redhat-marketplace-9g4wk\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.666358 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.702267 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.779965 4785 generic.go:334] "Generic (PLEG): container finished" podID="937be872-928d-4374-8605-eb4e529dbebe" containerID="740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847" exitCode=0 Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.780017 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerDied","Data":"740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847"} Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.780048 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcm4r" event={"ID":"937be872-928d-4374-8605-eb4e529dbebe","Type":"ContainerDied","Data":"ae51c466262f853014d46a59723d8bca6442e60974dce6dea0996367b5234285"} Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.780068 4785 scope.go:117] "RemoveContainer" containerID="740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.780216 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcm4r" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.802095 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-utilities\") pod \"937be872-928d-4374-8605-eb4e529dbebe\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.802421 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-catalog-content\") pod \"937be872-928d-4374-8605-eb4e529dbebe\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.802522 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t2tt\" (UniqueName: \"kubernetes.io/projected/937be872-928d-4374-8605-eb4e529dbebe-kube-api-access-7t2tt\") pod \"937be872-928d-4374-8605-eb4e529dbebe\" (UID: \"937be872-928d-4374-8605-eb4e529dbebe\") " Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.803056 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-utilities" (OuterVolumeSpecName: "utilities") pod "937be872-928d-4374-8605-eb4e529dbebe" (UID: "937be872-928d-4374-8605-eb4e529dbebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.809332 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937be872-928d-4374-8605-eb4e529dbebe-kube-api-access-7t2tt" (OuterVolumeSpecName: "kube-api-access-7t2tt") pod "937be872-928d-4374-8605-eb4e529dbebe" (UID: "937be872-928d-4374-8605-eb4e529dbebe"). InnerVolumeSpecName "kube-api-access-7t2tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.827192 4785 scope.go:117] "RemoveContainer" containerID="0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.850130 4785 scope.go:117] "RemoveContainer" containerID="e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.874328 4785 scope.go:117] "RemoveContainer" containerID="740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847" Nov 26 15:32:59 crc kubenswrapper[4785]: E1126 15:32:59.874710 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847\": container with ID starting with 740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847 not found: ID does not exist" containerID="740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.874750 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847"} err="failed to get container status \"740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847\": rpc error: code = NotFound desc = could not find container \"740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847\": container with ID starting with 740f3a55d8475edc3e38cf3009694d665b598c4c05166d31aa96a7c40e77e847 not found: ID does not exist" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.874774 4785 scope.go:117] "RemoveContainer" containerID="0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace" Nov 26 15:32:59 crc kubenswrapper[4785]: E1126 15:32:59.875018 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace\": container with ID starting with 0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace not found: ID does not exist" containerID="0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.875043 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace"} err="failed to get container status \"0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace\": rpc error: code = NotFound desc = could not find container \"0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace\": container with ID starting with 0b00758836c3d082cdd413c78f182d44c11557ebe4d9ab16db0a702d7790eace not found: ID does not exist" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.875060 4785 scope.go:117] "RemoveContainer" containerID="e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d" Nov 26 15:32:59 crc kubenswrapper[4785]: E1126 15:32:59.875261 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d\": container with ID starting with e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d not found: ID does not exist" containerID="e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.875284 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d"} err="failed to get container status \"e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d\": rpc error: code = NotFound desc = could not find container \"e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d\": 
container with ID starting with e55b4ee11e5e473c97d4e0673496965c4994e8eb405dc28767417dc94ab0093d not found: ID does not exist" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.908143 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.908178 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t2tt\" (UniqueName: \"kubernetes.io/projected/937be872-928d-4374-8605-eb4e529dbebe-kube-api-access-7t2tt\") on node \"crc\" DevicePath \"\"" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.908911 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "937be872-928d-4374-8605-eb4e529dbebe" (UID: "937be872-928d-4374-8605-eb4e529dbebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.911384 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.912024 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.959874 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.996426 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9d7v"] Nov 26 15:32:59 crc kubenswrapper[4785]: I1126 15:32:59.996710 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9d7v" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="registry-server" containerID="cri-o://4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0" gracePeriod=2 Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.009891 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/937be872-928d-4374-8605-eb4e529dbebe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.129817 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcm4r"] Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.140023 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rcm4r"] Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.156126 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g4wk"] Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 
15:33:00.402274 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.517600 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-utilities\") pod \"827906ee-a764-4950-849d-788b5f263040\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.517698 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vcpt\" (UniqueName: \"kubernetes.io/projected/827906ee-a764-4950-849d-788b5f263040-kube-api-access-2vcpt\") pod \"827906ee-a764-4950-849d-788b5f263040\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.517774 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-catalog-content\") pod \"827906ee-a764-4950-849d-788b5f263040\" (UID: \"827906ee-a764-4950-849d-788b5f263040\") " Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.518268 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-utilities" (OuterVolumeSpecName: "utilities") pod "827906ee-a764-4950-849d-788b5f263040" (UID: "827906ee-a764-4950-849d-788b5f263040"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.526810 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/827906ee-a764-4950-849d-788b5f263040-kube-api-access-2vcpt" (OuterVolumeSpecName: "kube-api-access-2vcpt") pod "827906ee-a764-4950-849d-788b5f263040" (UID: "827906ee-a764-4950-849d-788b5f263040"). InnerVolumeSpecName "kube-api-access-2vcpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.565402 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tvs7f"] Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.565648 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tvs7f" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="registry-server" containerID="cri-o://4503f975a44882cb04bdb348b1fca023dd249dee83c6c45974d02531062d1d05" gracePeriod=2 Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.608066 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "827906ee-a764-4950-849d-788b5f263040" (UID: "827906ee-a764-4950-849d-788b5f263040"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.619640 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vcpt\" (UniqueName: \"kubernetes.io/projected/827906ee-a764-4950-849d-788b5f263040-kube-api-access-2vcpt\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.619673 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.619682 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827906ee-a764-4950-849d-788b5f263040-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.788991 4785 generic.go:334] "Generic (PLEG): container finished" podID="827906ee-a764-4950-849d-788b5f263040" containerID="4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0" exitCode=0 Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.789052 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9d7v" event={"ID":"827906ee-a764-4950-849d-788b5f263040","Type":"ContainerDied","Data":"4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0"} Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.789082 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9d7v" event={"ID":"827906ee-a764-4950-849d-788b5f263040","Type":"ContainerDied","Data":"04db5effbc50aedfce6e419651a56228eb82ef23039673861e7e8bca262bbd48"} Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.789100 4785 scope.go:117] "RemoveContainer" containerID="4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.789244 
4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9d7v" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.806642 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.812877 4785 generic.go:334] "Generic (PLEG): container finished" podID="850a62cf-3d25-4de4-8783-d692272ee55f" containerID="4503f975a44882cb04bdb348b1fca023dd249dee83c6c45974d02531062d1d05" exitCode=0 Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.813019 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerDied","Data":"4503f975a44882cb04bdb348b1fca023dd249dee83c6c45974d02531062d1d05"} Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.814572 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g4wk" event={"ID":"738357c6-a8dc-4498-ba43-9124dc07b379","Type":"ContainerDied","Data":"5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae"} Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.815432 4785 generic.go:334] "Generic (PLEG): container finished" podID="738357c6-a8dc-4498-ba43-9124dc07b379" containerID="5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae" exitCode=0 Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.815605 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g4wk" event={"ID":"738357c6-a8dc-4498-ba43-9124dc07b379","Type":"ContainerStarted","Data":"3f2978c5068244b6df8e799516293fcdf004f45740d472efb38f1cc88c3760f1"} Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.834982 4785 scope.go:117] "RemoveContainer" containerID="ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e" Nov 26 15:33:00 crc 
kubenswrapper[4785]: I1126 15:33:00.859783 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9d7v"] Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.868983 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9d7v"] Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.874853 4785 scope.go:117] "RemoveContainer" containerID="ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.891377 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.893378 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.912576 4785 scope.go:117] "RemoveContainer" containerID="4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0" Nov 26 15:33:00 crc kubenswrapper[4785]: E1126 15:33:00.912947 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0\": container with ID starting with 4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0 not found: ID does not exist" containerID="4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.912985 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0"} err="failed to get container status \"4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0\": rpc error: code = NotFound desc = could not find container \"4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0\": 
container with ID starting with 4cc649213cfd442576cdf2538352d045193302df8bc7494c6a0509d2306624c0 not found: ID does not exist" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.913011 4785 scope.go:117] "RemoveContainer" containerID="ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e" Nov 26 15:33:00 crc kubenswrapper[4785]: E1126 15:33:00.913214 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e\": container with ID starting with ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e not found: ID does not exist" containerID="ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.913243 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e"} err="failed to get container status \"ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e\": rpc error: code = NotFound desc = could not find container \"ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e\": container with ID starting with ee5fc3fdb982fa41c5327a62679d8dcdeae2d0ffd3e6970e344e583cb549824e not found: ID does not exist" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.913260 4785 scope.go:117] "RemoveContainer" containerID="ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73" Nov 26 15:33:00 crc kubenswrapper[4785]: E1126 15:33:00.913472 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73\": container with ID starting with ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73 not found: ID does not exist" 
containerID="ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.913501 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73"} err="failed to get container status \"ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73\": rpc error: code = NotFound desc = could not find container \"ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73\": container with ID starting with ac5da10667af34211e75ac5f0e2c3eb3a8e0c633d36a0112911c64c06df4de73 not found: ID does not exist" Nov 26 15:33:00 crc kubenswrapper[4785]: I1126 15:33:00.992177 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.047250 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="827906ee-a764-4950-849d-788b5f263040" path="/var/lib/kubelet/pods/827906ee-a764-4950-849d-788b5f263040/volumes" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.048428 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937be872-928d-4374-8605-eb4e529dbebe" path="/var/lib/kubelet/pods/937be872-928d-4374-8605-eb4e529dbebe/volumes" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.138346 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72rk\" (UniqueName: \"kubernetes.io/projected/850a62cf-3d25-4de4-8783-d692272ee55f-kube-api-access-n72rk\") pod \"850a62cf-3d25-4de4-8783-d692272ee55f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.138419 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-catalog-content\") 
pod \"850a62cf-3d25-4de4-8783-d692272ee55f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.138534 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-utilities\") pod \"850a62cf-3d25-4de4-8783-d692272ee55f\" (UID: \"850a62cf-3d25-4de4-8783-d692272ee55f\") " Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.139303 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-utilities" (OuterVolumeSpecName: "utilities") pod "850a62cf-3d25-4de4-8783-d692272ee55f" (UID: "850a62cf-3d25-4de4-8783-d692272ee55f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.142529 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850a62cf-3d25-4de4-8783-d692272ee55f-kube-api-access-n72rk" (OuterVolumeSpecName: "kube-api-access-n72rk") pod "850a62cf-3d25-4de4-8783-d692272ee55f" (UID: "850a62cf-3d25-4de4-8783-d692272ee55f"). InnerVolumeSpecName "kube-api-access-n72rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.163473 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cnlfc"] Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.163730 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cnlfc" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="registry-server" containerID="cri-o://6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656" gracePeriod=2 Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.218083 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "850a62cf-3d25-4de4-8783-d692272ee55f" (UID: "850a62cf-3d25-4de4-8783-d692272ee55f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.240370 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72rk\" (UniqueName: \"kubernetes.io/projected/850a62cf-3d25-4de4-8783-d692272ee55f-kube-api-access-n72rk\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.240416 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.240434 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850a62cf-3d25-4de4-8783-d692272ee55f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.591291 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.748212 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzct6\" (UniqueName: \"kubernetes.io/projected/df43a35c-ce4b-434a-90a2-69bd1e97090d-kube-api-access-tzct6\") pod \"df43a35c-ce4b-434a-90a2-69bd1e97090d\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.748287 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-utilities\") pod \"df43a35c-ce4b-434a-90a2-69bd1e97090d\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.748327 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-catalog-content\") pod \"df43a35c-ce4b-434a-90a2-69bd1e97090d\" (UID: \"df43a35c-ce4b-434a-90a2-69bd1e97090d\") " Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.749147 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-utilities" (OuterVolumeSpecName: "utilities") pod "df43a35c-ce4b-434a-90a2-69bd1e97090d" (UID: "df43a35c-ce4b-434a-90a2-69bd1e97090d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.753706 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df43a35c-ce4b-434a-90a2-69bd1e97090d-kube-api-access-tzct6" (OuterVolumeSpecName: "kube-api-access-tzct6") pod "df43a35c-ce4b-434a-90a2-69bd1e97090d" (UID: "df43a35c-ce4b-434a-90a2-69bd1e97090d"). InnerVolumeSpecName "kube-api-access-tzct6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.828822 4785 generic.go:334] "Generic (PLEG): container finished" podID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerID="6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656" exitCode=0 Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.828899 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerDied","Data":"6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656"} Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.828927 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnlfc" event={"ID":"df43a35c-ce4b-434a-90a2-69bd1e97090d","Type":"ContainerDied","Data":"06df63a948f195415efab95ffc743a16fe3604afb5cc6a3abe0e7ef33502ffb7"} Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.828943 4785 scope.go:117] "RemoveContainer" containerID="6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.828941 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnlfc" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.835414 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tvs7f" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.835843 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvs7f" event={"ID":"850a62cf-3d25-4de4-8783-d692272ee55f","Type":"ContainerDied","Data":"d4489e5117b7a142e4a8cb0ddb42d2bdaa2e73dd72a7c7d19953c4b1f17f8f2c"} Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.843804 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df43a35c-ce4b-434a-90a2-69bd1e97090d" (UID: "df43a35c-ce4b-434a-90a2-69bd1e97090d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.850804 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzct6\" (UniqueName: \"kubernetes.io/projected/df43a35c-ce4b-434a-90a2-69bd1e97090d-kube-api-access-tzct6\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.851066 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.851075 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df43a35c-ce4b-434a-90a2-69bd1e97090d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.864200 4785 scope.go:117] "RemoveContainer" containerID="77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.888676 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tvs7f"] 
Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.892039 4785 scope.go:117] "RemoveContainer" containerID="d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.897057 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tvs7f"] Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.916538 4785 scope.go:117] "RemoveContainer" containerID="6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656" Nov 26 15:33:01 crc kubenswrapper[4785]: E1126 15:33:01.917377 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656\": container with ID starting with 6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656 not found: ID does not exist" containerID="6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.917427 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656"} err="failed to get container status \"6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656\": rpc error: code = NotFound desc = could not find container \"6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656\": container with ID starting with 6172c57b6e122d80447b6c4cb587e6af8d65f2753def9b47b0d42a29cf2a8656 not found: ID does not exist" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.917448 4785 scope.go:117] "RemoveContainer" containerID="77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2" Nov 26 15:33:01 crc kubenswrapper[4785]: E1126 15:33:01.917845 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2\": container with ID starting with 77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2 not found: ID does not exist" containerID="77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.917889 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2"} err="failed to get container status \"77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2\": rpc error: code = NotFound desc = could not find container \"77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2\": container with ID starting with 77aa32b19489df2fd4078e4bad85dce3001742ac17661d8093653f6f3dcf68f2 not found: ID does not exist" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.917925 4785 scope.go:117] "RemoveContainer" containerID="d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6" Nov 26 15:33:01 crc kubenswrapper[4785]: E1126 15:33:01.918208 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6\": container with ID starting with d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6 not found: ID does not exist" containerID="d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.918242 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6"} err="failed to get container status \"d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6\": rpc error: code = NotFound desc = could not find container \"d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6\": container with ID 
starting with d94f27706012990ca6bf12276424622b99a039806bae44aa87e5f2a9189883e6 not found: ID does not exist" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.918259 4785 scope.go:117] "RemoveContainer" containerID="4503f975a44882cb04bdb348b1fca023dd249dee83c6c45974d02531062d1d05" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.947205 4785 scope.go:117] "RemoveContainer" containerID="17daf3f3a3eb2c0a151f300e7066c3d675628e5d655e67e2857d35d3cd168bcd" Nov 26 15:33:01 crc kubenswrapper[4785]: I1126 15:33:01.972398 4785 scope.go:117] "RemoveContainer" containerID="5774977d7fb55468889bdb29eb42a4e1e2af8ce6b3424a149ad6a0ad16a8375e" Nov 26 15:33:02 crc kubenswrapper[4785]: I1126 15:33:02.158910 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cnlfc"] Nov 26 15:33:02 crc kubenswrapper[4785]: I1126 15:33:02.165180 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cnlfc"] Nov 26 15:33:02 crc kubenswrapper[4785]: I1126 15:33:02.857261 4785 generic.go:334] "Generic (PLEG): container finished" podID="738357c6-a8dc-4498-ba43-9124dc07b379" containerID="c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415" exitCode=0 Nov 26 15:33:02 crc kubenswrapper[4785]: I1126 15:33:02.857405 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g4wk" event={"ID":"738357c6-a8dc-4498-ba43-9124dc07b379","Type":"ContainerDied","Data":"c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415"} Nov 26 15:33:03 crc kubenswrapper[4785]: I1126 15:33:03.044955 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" path="/var/lib/kubelet/pods/850a62cf-3d25-4de4-8783-d692272ee55f/volumes" Nov 26 15:33:03 crc kubenswrapper[4785]: I1126 15:33:03.045872 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" 
path="/var/lib/kubelet/pods/df43a35c-ce4b-434a-90a2-69bd1e97090d/volumes" Nov 26 15:33:03 crc kubenswrapper[4785]: I1126 15:33:03.868715 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g4wk" event={"ID":"738357c6-a8dc-4498-ba43-9124dc07b379","Type":"ContainerStarted","Data":"e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34"} Nov 26 15:33:03 crc kubenswrapper[4785]: I1126 15:33:03.887965 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9g4wk" podStartSLOduration=2.215279097 podStartE2EDuration="4.88794861s" podCreationTimestamp="2025-11-26 15:32:59 +0000 UTC" firstStartedPulling="2025-11-26 15:33:00.834224124 +0000 UTC m=+1544.512589888" lastFinishedPulling="2025-11-26 15:33:03.506893617 +0000 UTC m=+1547.185259401" observedRunningTime="2025-11-26 15:33:03.887053546 +0000 UTC m=+1547.565419330" watchObservedRunningTime="2025-11-26 15:33:03.88794861 +0000 UTC m=+1547.566314374" Nov 26 15:33:05 crc kubenswrapper[4785]: I1126 15:33:05.565730 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t2vd"] Nov 26 15:33:05 crc kubenswrapper[4785]: I1126 15:33:05.566438 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5t2vd" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="registry-server" containerID="cri-o://cd1463dad96250b093df44e7c6ab9f6673afce5b0de80995e3b6a8c16db50c2e" gracePeriod=2 Nov 26 15:33:05 crc kubenswrapper[4785]: I1126 15:33:05.889170 4785 generic.go:334] "Generic (PLEG): container finished" podID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerID="cd1463dad96250b093df44e7c6ab9f6673afce5b0de80995e3b6a8c16db50c2e" exitCode=0 Nov 26 15:33:05 crc kubenswrapper[4785]: I1126 15:33:05.889261 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerDied","Data":"cd1463dad96250b093df44e7c6ab9f6673afce5b0de80995e3b6a8c16db50c2e"} Nov 26 15:33:05 crc kubenswrapper[4785]: I1126 15:33:05.966063 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvcvt"] Nov 26 15:33:05 crc kubenswrapper[4785]: I1126 15:33:05.966715 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jvcvt" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="registry-server" containerID="cri-o://4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144" gracePeriod=2 Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.112581 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.222909 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st5ll\" (UniqueName: \"kubernetes.io/projected/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-kube-api-access-st5ll\") pod \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.223094 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-utilities\") pod \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\" (UID: \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.223989 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-catalog-content\") pod \"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\" (UID: 
\"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f\") " Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.223803 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-utilities" (OuterVolumeSpecName: "utilities") pod "eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" (UID: "eb2fc6d0-088f-46cb-b99e-9650ecfbba7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.224592 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.245870 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-kube-api-access-st5ll" (OuterVolumeSpecName: "kube-api-access-st5ll") pod "eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" (UID: "eb2fc6d0-088f-46cb-b99e-9650ecfbba7f"). InnerVolumeSpecName "kube-api-access-st5ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.285363 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" (UID: "eb2fc6d0-088f-46cb-b99e-9650ecfbba7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.318947 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.325700 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st5ll\" (UniqueName: \"kubernetes.io/projected/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-kube-api-access-st5ll\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.325731 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.362478 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dx6bv"] Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.362754 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dx6bv" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="registry-server" containerID="cri-o://25e29c4311627efe08667078492a435d58d0181767c202437d86567030f68d9f" gracePeriod=2 Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.426832 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-catalog-content\") pod \"8abc4613-7050-40be-be6d-4ffc3e686b80\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.427003 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-utilities\") pod \"8abc4613-7050-40be-be6d-4ffc3e686b80\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.427070 4785 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2rvrk\" (UniqueName: \"kubernetes.io/projected/8abc4613-7050-40be-be6d-4ffc3e686b80-kube-api-access-2rvrk\") pod \"8abc4613-7050-40be-be6d-4ffc3e686b80\" (UID: \"8abc4613-7050-40be-be6d-4ffc3e686b80\") " Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.427736 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-utilities" (OuterVolumeSpecName: "utilities") pod "8abc4613-7050-40be-be6d-4ffc3e686b80" (UID: "8abc4613-7050-40be-be6d-4ffc3e686b80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.428359 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.432378 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8abc4613-7050-40be-be6d-4ffc3e686b80-kube-api-access-2rvrk" (OuterVolumeSpecName: "kube-api-access-2rvrk") pod "8abc4613-7050-40be-be6d-4ffc3e686b80" (UID: "8abc4613-7050-40be-be6d-4ffc3e686b80"). InnerVolumeSpecName "kube-api-access-2rvrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.470620 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8abc4613-7050-40be-be6d-4ffc3e686b80" (UID: "8abc4613-7050-40be-be6d-4ffc3e686b80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.529509 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rvrk\" (UniqueName: \"kubernetes.io/projected/8abc4613-7050-40be-be6d-4ffc3e686b80-kube-api-access-2rvrk\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.529566 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8abc4613-7050-40be-be6d-4ffc3e686b80-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.773418 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5lfl5"] Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.773812 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5lfl5" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="registry-server" containerID="cri-o://825f8ac8fd011688995c2e27ad9ddbeffc233278778c0af9bcc8be3fd6113da0" gracePeriod=2 Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.908306 4785 generic.go:334] "Generic (PLEG): container finished" podID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerID="4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144" exitCode=0 Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.908770 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerDied","Data":"4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144"} Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.908806 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvcvt" 
event={"ID":"8abc4613-7050-40be-be6d-4ffc3e686b80","Type":"ContainerDied","Data":"7df2beb3b04966441d2d33839e4283af114a8167afebec8616997839eb66a987"} Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.908827 4785 scope.go:117] "RemoveContainer" containerID="4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.909010 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvcvt" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.915327 4785 generic.go:334] "Generic (PLEG): container finished" podID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerID="25e29c4311627efe08667078492a435d58d0181767c202437d86567030f68d9f" exitCode=0 Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.915473 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerDied","Data":"25e29c4311627efe08667078492a435d58d0181767c202437d86567030f68d9f"} Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.915505 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dx6bv" event={"ID":"439a16e9-65e4-4450-b95a-79095a4eebb9","Type":"ContainerDied","Data":"0e94d951ea561cb61ba0d4257a748f2843787f8d1b5a9bd086e67510eb32a3d7"} Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.915517 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e94d951ea561cb61ba0d4257a748f2843787f8d1b5a9bd086e67510eb32a3d7" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.919195 4785 generic.go:334] "Generic (PLEG): container finished" podID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerID="825f8ac8fd011688995c2e27ad9ddbeffc233278778c0af9bcc8be3fd6113da0" exitCode=0 Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.919313 4785 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerDied","Data":"825f8ac8fd011688995c2e27ad9ddbeffc233278778c0af9bcc8be3fd6113da0"} Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.939701 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t2vd" event={"ID":"eb2fc6d0-088f-46cb-b99e-9650ecfbba7f","Type":"ContainerDied","Data":"84862910b35164c44c6e8f4f6c7011dc460d02e77fb695b0c6495d9a8805a269"} Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.939820 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t2vd" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.954732 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.961338 4785 scope.go:117] "RemoveContainer" containerID="339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5" Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.967289 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvcvt"] Nov 26 15:33:06 crc kubenswrapper[4785]: I1126 15:33:06.979887 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jvcvt"] Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.032218 4785 scope.go:117] "RemoveContainer" containerID="be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.036587 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-utilities\") pod \"439a16e9-65e4-4450-b95a-79095a4eebb9\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " Nov 26 
15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.036662 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-catalog-content\") pod \"439a16e9-65e4-4450-b95a-79095a4eebb9\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.036745 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdktf\" (UniqueName: \"kubernetes.io/projected/439a16e9-65e4-4450-b95a-79095a4eebb9-kube-api-access-gdktf\") pod \"439a16e9-65e4-4450-b95a-79095a4eebb9\" (UID: \"439a16e9-65e4-4450-b95a-79095a4eebb9\") " Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.037606 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-utilities" (OuterVolumeSpecName: "utilities") pod "439a16e9-65e4-4450-b95a-79095a4eebb9" (UID: "439a16e9-65e4-4450-b95a-79095a4eebb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.041084 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439a16e9-65e4-4450-b95a-79095a4eebb9-kube-api-access-gdktf" (OuterVolumeSpecName: "kube-api-access-gdktf") pod "439a16e9-65e4-4450-b95a-79095a4eebb9" (UID: "439a16e9-65e4-4450-b95a-79095a4eebb9"). InnerVolumeSpecName "kube-api-access-gdktf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.054285 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" path="/var/lib/kubelet/pods/8abc4613-7050-40be-be6d-4ffc3e686b80/volumes" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.060051 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t2vd"] Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.067926 4785 scope.go:117] "RemoveContainer" containerID="4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144" Nov 26 15:33:07 crc kubenswrapper[4785]: E1126 15:33:07.068965 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144\": container with ID starting with 4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144 not found: ID does not exist" containerID="4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.069002 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144"} err="failed to get container status \"4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144\": rpc error: code = NotFound desc = could not find container \"4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144\": container with ID starting with 4ad67c5e3ac765760b3dc50e08a9a706a85f69686b726d8c0dd9f81075238144 not found: ID does not exist" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.069026 4785 scope.go:117] "RemoveContainer" containerID="339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.070960 4785 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-5t2vd"] Nov 26 15:33:07 crc kubenswrapper[4785]: E1126 15:33:07.073491 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5\": container with ID starting with 339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5 not found: ID does not exist" containerID="339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.073547 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5"} err="failed to get container status \"339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5\": rpc error: code = NotFound desc = could not find container \"339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5\": container with ID starting with 339d2350548f7fae92c2350dba65d706bf8c57db15e8bd46d1c1c3243793bde5 not found: ID does not exist" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.073588 4785 scope.go:117] "RemoveContainer" containerID="be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493" Nov 26 15:33:07 crc kubenswrapper[4785]: E1126 15:33:07.073857 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493\": container with ID starting with be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493 not found: ID does not exist" containerID="be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.073880 4785 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493"} err="failed to get container status \"be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493\": rpc error: code = NotFound desc = could not find container \"be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493\": container with ID starting with be3d2bb6b395a2fbe95a4008af727eedf77fa6ed7ba69668b0763fcf5ac04493 not found: ID does not exist" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.073913 4785 scope.go:117] "RemoveContainer" containerID="cd1463dad96250b093df44e7c6ab9f6673afce5b0de80995e3b6a8c16db50c2e" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.100389 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "439a16e9-65e4-4450-b95a-79095a4eebb9" (UID: "439a16e9-65e4-4450-b95a-79095a4eebb9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.118689 4785 scope.go:117] "RemoveContainer" containerID="b593c6fb90d9b6de665762269dfc77d1407d10cc6488d05d2ff5c64efde7f361" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.139244 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.139292 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/439a16e9-65e4-4450-b95a-79095a4eebb9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.139302 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdktf\" (UniqueName: \"kubernetes.io/projected/439a16e9-65e4-4450-b95a-79095a4eebb9-kube-api-access-gdktf\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.148010 4785 scope.go:117] "RemoveContainer" containerID="cb08ad323fd0d86944c6b644cd09dd47ae8b5973cb650f6d970dc79ad0627e14" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.198694 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.341777 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-catalog-content\") pod \"42779d5c-2b5c-4058-9b9d-995285d02e80\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.341844 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/42779d5c-2b5c-4058-9b9d-995285d02e80-kube-api-access-cngvm\") pod \"42779d5c-2b5c-4058-9b9d-995285d02e80\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.341873 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-utilities\") pod \"42779d5c-2b5c-4058-9b9d-995285d02e80\" (UID: \"42779d5c-2b5c-4058-9b9d-995285d02e80\") " Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.344178 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-utilities" (OuterVolumeSpecName: "utilities") pod "42779d5c-2b5c-4058-9b9d-995285d02e80" (UID: "42779d5c-2b5c-4058-9b9d-995285d02e80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.345974 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42779d5c-2b5c-4058-9b9d-995285d02e80-kube-api-access-cngvm" (OuterVolumeSpecName: "kube-api-access-cngvm") pod "42779d5c-2b5c-4058-9b9d-995285d02e80" (UID: "42779d5c-2b5c-4058-9b9d-995285d02e80"). InnerVolumeSpecName "kube-api-access-cngvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.392423 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42779d5c-2b5c-4058-9b9d-995285d02e80" (UID: "42779d5c-2b5c-4058-9b9d-995285d02e80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.443621 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.443673 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cngvm\" (UniqueName: \"kubernetes.io/projected/42779d5c-2b5c-4058-9b9d-995285d02e80-kube-api-access-cngvm\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.443690 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42779d5c-2b5c-4058-9b9d-995285d02e80-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.567760 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h295v"] Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.568045 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h295v" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="registry-server" containerID="cri-o://90c5acb30ad680d4d4e83c48bedd33d933eabe341995fc406429a70569aeead2" gracePeriod=2 Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.953669 4785 generic.go:334] "Generic (PLEG): container finished" 
podID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerID="90c5acb30ad680d4d4e83c48bedd33d933eabe341995fc406429a70569aeead2" exitCode=0 Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.954401 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerDied","Data":"90c5acb30ad680d4d4e83c48bedd33d933eabe341995fc406429a70569aeead2"} Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.954437 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h295v" event={"ID":"e16ee24a-ec88-4d48-8a64-1136e673aa8d","Type":"ContainerDied","Data":"c5b3bd83719eec7acbb5af7518744db1e6504c045f5e196b0d9a75692f817d75"} Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.954456 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5b3bd83719eec7acbb5af7518744db1e6504c045f5e196b0d9a75692f817d75" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.956792 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5lfl5" event={"ID":"42779d5c-2b5c-4058-9b9d-995285d02e80","Type":"ContainerDied","Data":"3e9816f4cf71b2c70f4a2a21729d759b0d2f7cc5d52351899cacc05f8d5ea1d8"} Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.956833 4785 scope.go:117] "RemoveContainer" containerID="825f8ac8fd011688995c2e27ad9ddbeffc233278778c0af9bcc8be3fd6113da0" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.956945 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5lfl5" Nov 26 15:33:07 crc kubenswrapper[4785]: I1126 15:33:07.962013 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dx6bv" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.003835 4785 scope.go:117] "RemoveContainer" containerID="876da037fa54d4010a81d565d7c214117d0bf6b0d4ab39ad90ec9741a5dfbc42" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.004042 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.040082 4785 scope.go:117] "RemoveContainer" containerID="69cb355ef838ac455688d18fb38bac207484ba1426f6347c983a0f651ba74df7" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.041328 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dx6bv"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.053992 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dx6bv"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.064508 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-26zth"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.075390 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-26zth"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.086020 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5lfl5"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.093704 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5lfl5"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.152967 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6gsk\" (UniqueName: \"kubernetes.io/projected/e16ee24a-ec88-4d48-8a64-1136e673aa8d-kube-api-access-x6gsk\") pod \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\" (UID: 
\"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.153180 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-utilities\") pod \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.153276 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-catalog-content\") pod \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\" (UID: \"e16ee24a-ec88-4d48-8a64-1136e673aa8d\") " Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.154178 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-utilities" (OuterVolumeSpecName: "utilities") pod "e16ee24a-ec88-4d48-8a64-1136e673aa8d" (UID: "e16ee24a-ec88-4d48-8a64-1136e673aa8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.157827 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16ee24a-ec88-4d48-8a64-1136e673aa8d-kube-api-access-x6gsk" (OuterVolumeSpecName: "kube-api-access-x6gsk") pod "e16ee24a-ec88-4d48-8a64-1136e673aa8d" (UID: "e16ee24a-ec88-4d48-8a64-1136e673aa8d"). InnerVolumeSpecName "kube-api-access-x6gsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.196283 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e16ee24a-ec88-4d48-8a64-1136e673aa8d" (UID: "e16ee24a-ec88-4d48-8a64-1136e673aa8d"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.255524 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.255584 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e16ee24a-ec88-4d48-8a64-1136e673aa8d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.255597 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6gsk\" (UniqueName: \"kubernetes.io/projected/e16ee24a-ec88-4d48-8a64-1136e673aa8d-kube-api-access-x6gsk\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.367590 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmq9d"] Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.367861 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmq9d" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="registry-server" containerID="cri-o://7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f" gracePeriod=2 Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.792862 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.966986 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-utilities\") pod \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.967109 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-catalog-content\") pod \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.967154 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x46mq\" (UniqueName: \"kubernetes.io/projected/98cf1e09-3af0-4286-aa6f-43dbff032ba1-kube-api-access-x46mq\") pod \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\" (UID: \"98cf1e09-3af0-4286-aa6f-43dbff032ba1\") " Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.968628 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-utilities" (OuterVolumeSpecName: "utilities") pod "98cf1e09-3af0-4286-aa6f-43dbff032ba1" (UID: "98cf1e09-3af0-4286-aa6f-43dbff032ba1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.974825 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cf1e09-3af0-4286-aa6f-43dbff032ba1-kube-api-access-x46mq" (OuterVolumeSpecName: "kube-api-access-x46mq") pod "98cf1e09-3af0-4286-aa6f-43dbff032ba1" (UID: "98cf1e09-3af0-4286-aa6f-43dbff032ba1"). InnerVolumeSpecName "kube-api-access-x46mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.976940 4785 generic.go:334] "Generic (PLEG): container finished" podID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerID="7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f" exitCode=0 Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.977048 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmq9d" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.977088 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h295v" Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.977167 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerDied","Data":"7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f"} Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.977288 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmq9d" event={"ID":"98cf1e09-3af0-4286-aa6f-43dbff032ba1","Type":"ContainerDied","Data":"7bde13b94a28351dd6efa79b3720dd393248618d1b950614b3b12f2a0a055e06"} Nov 26 15:33:08 crc kubenswrapper[4785]: I1126 15:33:08.977369 4785 scope.go:117] "RemoveContainer" containerID="7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.019066 4785 scope.go:117] "RemoveContainer" containerID="05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.029055 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"98cf1e09-3af0-4286-aa6f-43dbff032ba1" (UID: "98cf1e09-3af0-4286-aa6f-43dbff032ba1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.029136 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h295v"] Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.035121 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h295v"] Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.044623 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" path="/var/lib/kubelet/pods/42779d5c-2b5c-4058-9b9d-995285d02e80/volumes" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.045715 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" path="/var/lib/kubelet/pods/439a16e9-65e4-4450-b95a-79095a4eebb9/volumes" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.046888 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6236e0d-402b-4338-93c8-4f1a65013acc" path="/var/lib/kubelet/pods/d6236e0d-402b-4338-93c8-4f1a65013acc/volumes" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.048296 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" path="/var/lib/kubelet/pods/e16ee24a-ec88-4d48-8a64-1136e673aa8d/volumes" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.049598 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" path="/var/lib/kubelet/pods/eb2fc6d0-088f-46cb-b99e-9650ecfbba7f/volumes" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.053646 4785 scope.go:117] "RemoveContainer" containerID="0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3" Nov 26 15:33:09 crc kubenswrapper[4785]: 
I1126 15:33:09.069342 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.069385 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x46mq\" (UniqueName: \"kubernetes.io/projected/98cf1e09-3af0-4286-aa6f-43dbff032ba1-kube-api-access-x46mq\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.069401 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98cf1e09-3af0-4286-aa6f-43dbff032ba1-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.069965 4785 scope.go:117] "RemoveContainer" containerID="7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f" Nov 26 15:33:09 crc kubenswrapper[4785]: E1126 15:33:09.070368 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f\": container with ID starting with 7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f not found: ID does not exist" containerID="7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.070401 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f"} err="failed to get container status \"7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f\": rpc error: code = NotFound desc = could not find container \"7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f\": container with ID starting with 7cbdb5c31aec060d44293097f056a2c99a25eec7e8178a0c0a0ed57148b9259f not found: ID does not 
exist" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.070427 4785 scope.go:117] "RemoveContainer" containerID="05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273" Nov 26 15:33:09 crc kubenswrapper[4785]: E1126 15:33:09.070669 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273\": container with ID starting with 05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273 not found: ID does not exist" containerID="05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.070695 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273"} err="failed to get container status \"05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273\": rpc error: code = NotFound desc = could not find container \"05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273\": container with ID starting with 05b1732db240efced8921afba290b21f920c035ce14d008e12a64d698231a273 not found: ID does not exist" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.070714 4785 scope.go:117] "RemoveContainer" containerID="0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3" Nov 26 15:33:09 crc kubenswrapper[4785]: E1126 15:33:09.071183 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3\": container with ID starting with 0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3 not found: ID does not exist" containerID="0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.071238 4785 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3"} err="failed to get container status \"0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3\": rpc error: code = NotFound desc = could not find container \"0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3\": container with ID starting with 0ecd86ffa7062aecf01aca5bb7597113bb22e8dc98ff3a7f2a982517eace1de3 not found: ID does not exist" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.167868 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frnfh"] Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.168142 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frnfh" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="registry-server" containerID="cri-o://c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd" gracePeriod=2 Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.309479 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmq9d"] Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.314917 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmq9d"] Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.493353 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.577431 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-utilities\") pod \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.577537 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7crcd\" (UniqueName: \"kubernetes.io/projected/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-kube-api-access-7crcd\") pod \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.577595 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-catalog-content\") pod \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\" (UID: \"5ab902c8-4fc4-4a16-a87d-a3b0979c6895\") " Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.580076 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-utilities" (OuterVolumeSpecName: "utilities") pod "5ab902c8-4fc4-4a16-a87d-a3b0979c6895" (UID: "5ab902c8-4fc4-4a16-a87d-a3b0979c6895"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.585236 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-kube-api-access-7crcd" (OuterVolumeSpecName: "kube-api-access-7crcd") pod "5ab902c8-4fc4-4a16-a87d-a3b0979c6895" (UID: "5ab902c8-4fc4-4a16-a87d-a3b0979c6895"). InnerVolumeSpecName "kube-api-access-7crcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.624751 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ab902c8-4fc4-4a16-a87d-a3b0979c6895" (UID: "5ab902c8-4fc4-4a16-a87d-a3b0979c6895"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.679041 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.679079 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7crcd\" (UniqueName: \"kubernetes.io/projected/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-kube-api-access-7crcd\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.679091 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab902c8-4fc4-4a16-a87d-a3b0979c6895-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.702730 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.702807 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.758763 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.988463 4785 generic.go:334] "Generic (PLEG): container finished" 
podID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerID="c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd" exitCode=0 Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.988507 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frnfh" Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.988582 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerDied","Data":"c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd"} Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.988612 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frnfh" event={"ID":"5ab902c8-4fc4-4a16-a87d-a3b0979c6895","Type":"ContainerDied","Data":"9281f7c8493951a7c2d93b02236c4b1cf01bbdc241c7f47032d73d4203b99636"} Nov 26 15:33:09 crc kubenswrapper[4785]: I1126 15:33:09.988633 4785 scope.go:117] "RemoveContainer" containerID="c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.018962 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frnfh"] Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.025268 4785 scope.go:117] "RemoveContainer" containerID="cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.027361 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frnfh"] Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.044316 4785 scope.go:117] "RemoveContainer" containerID="b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.052225 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.081077 4785 scope.go:117] "RemoveContainer" containerID="c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd" Nov 26 15:33:10 crc kubenswrapper[4785]: E1126 15:33:10.081458 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd\": container with ID starting with c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd not found: ID does not exist" containerID="c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.081486 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd"} err="failed to get container status \"c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd\": rpc error: code = NotFound desc = could not find container \"c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd\": container with ID starting with c6a406b0e0de328d42317115430647e0220396399715e31075a89f5216da20fd not found: ID does not exist" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.081504 4785 scope.go:117] "RemoveContainer" containerID="cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f" Nov 26 15:33:10 crc kubenswrapper[4785]: E1126 15:33:10.081739 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f\": container with ID starting with cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f not found: ID does not exist" containerID="cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.081774 
4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f"} err="failed to get container status \"cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f\": rpc error: code = NotFound desc = could not find container \"cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f\": container with ID starting with cdfd91481c17f5adad4807fd3a05c2dbcc2d041be18ff74d1505c5352e6e607f not found: ID does not exist" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.081787 4785 scope.go:117] "RemoveContainer" containerID="b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a" Nov 26 15:33:10 crc kubenswrapper[4785]: E1126 15:33:10.081949 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a\": container with ID starting with b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a not found: ID does not exist" containerID="b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a" Nov 26 15:33:10 crc kubenswrapper[4785]: I1126 15:33:10.081969 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a"} err="failed to get container status \"b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a\": rpc error: code = NotFound desc = could not find container \"b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a\": container with ID starting with b304468eef8130490e93312e8412f880624d23d67554ca91dd399e147d1f530a not found: ID does not exist" Nov 26 15:33:11 crc kubenswrapper[4785]: I1126 15:33:11.052959 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" 
path="/var/lib/kubelet/pods/5ab902c8-4fc4-4a16-a87d-a3b0979c6895/volumes" Nov 26 15:33:11 crc kubenswrapper[4785]: I1126 15:33:11.053940 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" path="/var/lib/kubelet/pods/98cf1e09-3af0-4286-aa6f-43dbff032ba1/volumes" Nov 26 15:33:11 crc kubenswrapper[4785]: I1126 15:33:11.166031 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g4wk"] Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.036549 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:33:12 crc kubenswrapper[4785]: E1126 15:33:12.037248 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.038697 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9g4wk" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="registry-server" containerID="cri-o://e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34" gracePeriod=2 Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.467207 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.658703 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkjv9\" (UniqueName: \"kubernetes.io/projected/738357c6-a8dc-4498-ba43-9124dc07b379-kube-api-access-lkjv9\") pod \"738357c6-a8dc-4498-ba43-9124dc07b379\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.658777 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-catalog-content\") pod \"738357c6-a8dc-4498-ba43-9124dc07b379\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.658842 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-utilities\") pod \"738357c6-a8dc-4498-ba43-9124dc07b379\" (UID: \"738357c6-a8dc-4498-ba43-9124dc07b379\") " Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.659639 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-utilities" (OuterVolumeSpecName: "utilities") pod "738357c6-a8dc-4498-ba43-9124dc07b379" (UID: "738357c6-a8dc-4498-ba43-9124dc07b379"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.664010 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738357c6-a8dc-4498-ba43-9124dc07b379-kube-api-access-lkjv9" (OuterVolumeSpecName: "kube-api-access-lkjv9") pod "738357c6-a8dc-4498-ba43-9124dc07b379" (UID: "738357c6-a8dc-4498-ba43-9124dc07b379"). InnerVolumeSpecName "kube-api-access-lkjv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.676859 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "738357c6-a8dc-4498-ba43-9124dc07b379" (UID: "738357c6-a8dc-4498-ba43-9124dc07b379"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.760272 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkjv9\" (UniqueName: \"kubernetes.io/projected/738357c6-a8dc-4498-ba43-9124dc07b379-kube-api-access-lkjv9\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.760321 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:12 crc kubenswrapper[4785]: I1126 15:33:12.760342 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738357c6-a8dc-4498-ba43-9124dc07b379-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.049368 4785 generic.go:334] "Generic (PLEG): container finished" podID="738357c6-a8dc-4498-ba43-9124dc07b379" containerID="e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34" exitCode=0 Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.049416 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g4wk" event={"ID":"738357c6-a8dc-4498-ba43-9124dc07b379","Type":"ContainerDied","Data":"e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34"} Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.049440 4785 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g4wk" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.049456 4785 scope.go:117] "RemoveContainer" containerID="e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.049444 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g4wk" event={"ID":"738357c6-a8dc-4498-ba43-9124dc07b379","Type":"ContainerDied","Data":"3f2978c5068244b6df8e799516293fcdf004f45740d472efb38f1cc88c3760f1"} Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.081654 4785 scope.go:117] "RemoveContainer" containerID="c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.102587 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g4wk"] Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.111660 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g4wk"] Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.115686 4785 scope.go:117] "RemoveContainer" containerID="5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.143850 4785 scope.go:117] "RemoveContainer" containerID="e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.144306 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34\": container with ID starting with e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34 not found: ID does not exist" containerID="e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.144385 4785 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34"} err="failed to get container status \"e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34\": rpc error: code = NotFound desc = could not find container \"e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34\": container with ID starting with e1467c7c9e059371a3600cd448fec5aee3ac7ce046c2897de2d28f10b8513f34 not found: ID does not exist" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.144412 4785 scope.go:117] "RemoveContainer" containerID="c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.145017 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415\": container with ID starting with c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415 not found: ID does not exist" containerID="c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.145048 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415"} err="failed to get container status \"c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415\": rpc error: code = NotFound desc = could not find container \"c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415\": container with ID starting with c45d7a79ba58e398eb85c442d4220451fbac7318d0159b42a81337bbf58bf415 not found: ID does not exist" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.145066 4785 scope.go:117] "RemoveContainer" containerID="5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 
15:33:13.145713 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae\": container with ID starting with 5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae not found: ID does not exist" containerID="5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.145746 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae"} err="failed to get container status \"5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae\": rpc error: code = NotFound desc = could not find container \"5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae\": container with ID starting with 5eff172a6af41ffc846f6042855c01d215c42f380efcc9094c41944d63726fae not found: ID does not exist" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974126 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xlnfx"] Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974678 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974693 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974706 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974713 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="extract-content" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.974726 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974734 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974748 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974755 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974766 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974773 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974784 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974791 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974802 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974811 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="extract-utilities" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.974821 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974829 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974844 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974851 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974866 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974873 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974888 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974896 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974906 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974917 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="registry-server" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.974929 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974936 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974946 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974953 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974965 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974972 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.974982 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.974990 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975002 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975009 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="registry-server" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.975018 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975025 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975038 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975045 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975056 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975063 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975076 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975083 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975094 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975102 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="937be872-928d-4374-8605-eb4e529dbebe" containerName="registry-server" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.975113 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975120 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975132 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975140 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975154 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975161 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975170 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975177 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975187 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975194 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="extract-content" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.975205 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975213 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="extract-content" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975223 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975231 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975243 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975250 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975263 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975271 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975284 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975291 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="extract-utilities" Nov 26 15:33:13 crc 
kubenswrapper[4785]: E1126 15:33:13.975305 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975313 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975324 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975331 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975341 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975348 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: E1126 15:33:13.975363 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975371 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="extract-utilities" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975506 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab902c8-4fc4-4a16-a87d-a3b0979c6895" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975522 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cf1e09-3af0-4286-aa6f-43dbff032ba1" containerName="registry-server" Nov 26 15:33:13 
crc kubenswrapper[4785]: I1126 15:33:13.975532 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="827906ee-a764-4950-849d-788b5f263040" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975543 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="df43a35c-ce4b-434a-90a2-69bd1e97090d" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975571 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2fc6d0-088f-46cb-b99e-9650ecfbba7f" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975581 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="850a62cf-3d25-4de4-8783-d692272ee55f" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975591 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="42779d5c-2b5c-4058-9b9d-995285d02e80" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975608 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="439a16e9-65e4-4450-b95a-79095a4eebb9" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975622 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="8abc4613-7050-40be-be6d-4ffc3e686b80" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975633 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975647 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16ee24a-ec88-4d48-8a64-1136e673aa8d" containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.975656 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="937be872-928d-4374-8605-eb4e529dbebe" 
containerName="registry-server" Nov 26 15:33:13 crc kubenswrapper[4785]: I1126 15:33:13.976793 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.016670 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xlnfx"] Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.060465 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-88srj"] Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.071837 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-88srj"] Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.084663 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkh6n\" (UniqueName: \"kubernetes.io/projected/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-kube-api-access-gkh6n\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.084957 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-utilities\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.085031 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-catalog-content\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc 
kubenswrapper[4785]: I1126 15:33:14.186961 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkh6n\" (UniqueName: \"kubernetes.io/projected/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-kube-api-access-gkh6n\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.187056 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-utilities\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.187109 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-catalog-content\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.187607 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-catalog-content\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.187721 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-utilities\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.208484 4785 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkh6n\" (UniqueName: \"kubernetes.io/projected/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-kube-api-access-gkh6n\") pod \"redhat-operators-xlnfx\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.335211 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:14 crc kubenswrapper[4785]: I1126 15:33:14.758747 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xlnfx"] Nov 26 15:33:14 crc kubenswrapper[4785]: W1126 15:33:14.762652 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e28a930_bff6_48b6_b7ac_e34c20ac22e1.slice/crio-fe6acd255b84987ed42328d0e6390d250b139b463e16cf3f18044633a4deda02 WatchSource:0}: Error finding container fe6acd255b84987ed42328d0e6390d250b139b463e16cf3f18044633a4deda02: Status 404 returned error can't find the container with id fe6acd255b84987ed42328d0e6390d250b139b463e16cf3f18044633a4deda02 Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.044907 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738357c6-a8dc-4498-ba43-9124dc07b379" path="/var/lib/kubelet/pods/738357c6-a8dc-4498-ba43-9124dc07b379/volumes" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.046883 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f64f69d-c5fa-4936-b96f-c99658d8ff59" path="/var/lib/kubelet/pods/9f64f69d-c5fa-4936-b96f-c99658d8ff59/volumes" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.065153 4785 generic.go:334] "Generic (PLEG): container finished" podID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerID="1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761" exitCode=0 Nov 26 15:33:15 
crc kubenswrapper[4785]: I1126 15:33:15.065194 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlnfx" event={"ID":"4e28a930-bff6-48b6-b7ac-e34c20ac22e1","Type":"ContainerDied","Data":"1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761"} Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.065219 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlnfx" event={"ID":"4e28a930-bff6-48b6-b7ac-e34c20ac22e1","Type":"ContainerStarted","Data":"fe6acd255b84987ed42328d0e6390d250b139b463e16cf3f18044633a4deda02"} Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.584823 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sh25t"] Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.587441 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.611346 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-catalog-content\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.611603 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-utilities\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.611693 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm79p\" 
(UniqueName: \"kubernetes.io/projected/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-kube-api-access-mm79p\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.618738 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sh25t"] Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.712339 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-catalog-content\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.712875 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-utilities\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.712923 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm79p\" (UniqueName: \"kubernetes.io/projected/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-kube-api-access-mm79p\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.713319 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-utilities\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc 
kubenswrapper[4785]: I1126 15:33:15.713517 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-catalog-content\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.738516 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm79p\" (UniqueName: \"kubernetes.io/projected/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-kube-api-access-mm79p\") pod \"community-operators-sh25t\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:15 crc kubenswrapper[4785]: I1126 15:33:15.917313 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:16 crc kubenswrapper[4785]: W1126 15:33:16.392829 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0142e7b_caff_4c76_86db_0b2fa7f3fc6d.slice/crio-775e3e87ded51a00376228088ad279d5653ce542b982e743ee9c1cfaaba6818c WatchSource:0}: Error finding container 775e3e87ded51a00376228088ad279d5653ce542b982e743ee9c1cfaaba6818c: Status 404 returned error can't find the container with id 775e3e87ded51a00376228088ad279d5653ce542b982e743ee9c1cfaaba6818c Nov 26 15:33:16 crc kubenswrapper[4785]: I1126 15:33:16.401348 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sh25t"] Nov 26 15:33:17 crc kubenswrapper[4785]: I1126 15:33:17.081836 4785 generic.go:334] "Generic (PLEG): container finished" podID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerID="9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4" exitCode=0 Nov 26 15:33:17 crc kubenswrapper[4785]: I1126 15:33:17.081897 4785 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh25t" event={"ID":"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d","Type":"ContainerDied","Data":"9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4"} Nov 26 15:33:17 crc kubenswrapper[4785]: I1126 15:33:17.082261 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh25t" event={"ID":"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d","Type":"ContainerStarted","Data":"775e3e87ded51a00376228088ad279d5653ce542b982e743ee9c1cfaaba6818c"} Nov 26 15:33:17 crc kubenswrapper[4785]: I1126 15:33:17.084667 4785 generic.go:334] "Generic (PLEG): container finished" podID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerID="a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da" exitCode=0 Nov 26 15:33:17 crc kubenswrapper[4785]: I1126 15:33:17.084747 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlnfx" event={"ID":"4e28a930-bff6-48b6-b7ac-e34c20ac22e1","Type":"ContainerDied","Data":"a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da"} Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.096542 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlnfx" event={"ID":"4e28a930-bff6-48b6-b7ac-e34c20ac22e1","Type":"ContainerStarted","Data":"eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45"} Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.134321 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xlnfx" podStartSLOduration=2.593898589 podStartE2EDuration="5.134291768s" podCreationTimestamp="2025-11-26 15:33:13 +0000 UTC" firstStartedPulling="2025-11-26 15:33:15.066613149 +0000 UTC m=+1558.744978913" lastFinishedPulling="2025-11-26 15:33:17.607006318 +0000 UTC m=+1561.285372092" observedRunningTime="2025-11-26 15:33:18.118126979 +0000 UTC 
m=+1561.796492773" watchObservedRunningTime="2025-11-26 15:33:18.134291768 +0000 UTC m=+1561.812657572" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.163785 4785 scope.go:117] "RemoveContainer" containerID="2ffccd3fa5f3f4730162cea4ccfc21ba0252eb8a41b340b551ef2bffa59e36ff" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.189916 4785 scope.go:117] "RemoveContainer" containerID="65b12e9b3c4851b5c6da5e144adb4e86f98c4f43ea830afdc18b44b3103d9e80" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.239187 4785 scope.go:117] "RemoveContainer" containerID="0867da53aa2c9940d8e7df4d24d84370f7a4815a48484465e7f9b7abebd3337d" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.270479 4785 scope.go:117] "RemoveContainer" containerID="8c84a90c72de5e5893aba6898eed2afdef0fdc40d4d733ea01f11a13a269d5d7" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.326411 4785 scope.go:117] "RemoveContainer" containerID="3dada919d61dce3122685dda7213cbfdbb204af8baad41f68c5704395899363b" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.351303 4785 scope.go:117] "RemoveContainer" containerID="a22899436e8934c2833baf58645fe45b83e14098745e66a57da67570714e34db" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.372244 4785 scope.go:117] "RemoveContainer" containerID="4f7040ec43f8c18d36fb6378aa0d99d6ca53b4c91c2ccacefed46dbcec1a1bef" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.399498 4785 scope.go:117] "RemoveContainer" containerID="99d413fedadb24041128646f11fb62f124690d7419c9688868c720918a96507f" Nov 26 15:33:18 crc kubenswrapper[4785]: I1126 15:33:18.419222 4785 scope.go:117] "RemoveContainer" containerID="3fd311ccc70cf8f4a3a683a80de70adbe247031b5a71df152a6b8b8b897179f4" Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.105291 4785 generic.go:334] "Generic (PLEG): container finished" podID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerID="1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc" exitCode=0 Nov 26 15:33:19 crc 
kubenswrapper[4785]: I1126 15:33:19.105366 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh25t" event={"ID":"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d","Type":"ContainerDied","Data":"1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc"} Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.787704 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vhpr7"] Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.789974 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.802283 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vhpr7"] Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.972098 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8hw7\" (UniqueName: \"kubernetes.io/projected/9d23856b-e9af-4a74-bddc-f17804e47ffa-kube-api-access-m8hw7\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.972199 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-catalog-content\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:19 crc kubenswrapper[4785]: I1126 15:33:19.972355 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-utilities\") pod \"certified-operators-vhpr7\" (UID: 
\"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.074415 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-catalog-content\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.074516 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-utilities\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.074751 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8hw7\" (UniqueName: \"kubernetes.io/projected/9d23856b-e9af-4a74-bddc-f17804e47ffa-kube-api-access-m8hw7\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.075370 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-utilities\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.076046 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-catalog-content\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") 
" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.103781 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8hw7\" (UniqueName: \"kubernetes.io/projected/9d23856b-e9af-4a74-bddc-f17804e47ffa-kube-api-access-m8hw7\") pod \"certified-operators-vhpr7\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.122380 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:20 crc kubenswrapper[4785]: I1126 15:33:20.557795 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vhpr7"] Nov 26 15:33:20 crc kubenswrapper[4785]: W1126 15:33:20.558162 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d23856b_e9af_4a74_bddc_f17804e47ffa.slice/crio-1bd13247ef4566364d3a99b99a74bd17cdbec1a096daee5dedce3bd0837cfb9d WatchSource:0}: Error finding container 1bd13247ef4566364d3a99b99a74bd17cdbec1a096daee5dedce3bd0837cfb9d: Status 404 returned error can't find the container with id 1bd13247ef4566364d3a99b99a74bd17cdbec1a096daee5dedce3bd0837cfb9d Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.122081 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh25t" event={"ID":"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d","Type":"ContainerStarted","Data":"4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478"} Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.123804 4785 generic.go:334] "Generic (PLEG): container finished" podID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerID="cf7bd40cb3cb3a3bf0cffa98e568dc4b7a5659a4fd41b0030090d7c17e876acf" exitCode=0 Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 
15:33:21.123854 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhpr7" event={"ID":"9d23856b-e9af-4a74-bddc-f17804e47ffa","Type":"ContainerDied","Data":"cf7bd40cb3cb3a3bf0cffa98e568dc4b7a5659a4fd41b0030090d7c17e876acf"} Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.123917 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhpr7" event={"ID":"9d23856b-e9af-4a74-bddc-f17804e47ffa","Type":"ContainerStarted","Data":"1bd13247ef4566364d3a99b99a74bd17cdbec1a096daee5dedce3bd0837cfb9d"} Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.143637 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sh25t" podStartSLOduration=3.241525992 podStartE2EDuration="6.143616391s" podCreationTimestamp="2025-11-26 15:33:15 +0000 UTC" firstStartedPulling="2025-11-26 15:33:17.084245481 +0000 UTC m=+1560.762611285" lastFinishedPulling="2025-11-26 15:33:19.98633592 +0000 UTC m=+1563.664701684" observedRunningTime="2025-11-26 15:33:21.14210596 +0000 UTC m=+1564.820471724" watchObservedRunningTime="2025-11-26 15:33:21.143616391 +0000 UTC m=+1564.821982155" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.575156 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghbwg"] Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.577198 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.587652 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghbwg"] Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.604671 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw7bf\" (UniqueName: \"kubernetes.io/projected/f9f06b92-7cee-45e2-870e-aa8805ec08fa-kube-api-access-nw7bf\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.604709 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-catalog-content\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.604736 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-utilities\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.706150 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw7bf\" (UniqueName: \"kubernetes.io/projected/f9f06b92-7cee-45e2-870e-aa8805ec08fa-kube-api-access-nw7bf\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.706208 4785 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-catalog-content\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.706236 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-utilities\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.706834 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-catalog-content\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.706855 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-utilities\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.738498 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw7bf\" (UniqueName: \"kubernetes.io/projected/f9f06b92-7cee-45e2-870e-aa8805ec08fa-kube-api-access-nw7bf\") pod \"redhat-marketplace-ghbwg\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:21 crc kubenswrapper[4785]: I1126 15:33:21.900604 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:22 crc kubenswrapper[4785]: W1126 15:33:22.393758 4785 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f06b92_7cee_45e2_870e_aa8805ec08fa.slice/crio-e5a9df6d069c1c64662e8495b04da373f3381bbd43948cfcb8bc75487045f6e2 WatchSource:0}: Error finding container e5a9df6d069c1c64662e8495b04da373f3381bbd43948cfcb8bc75487045f6e2: Status 404 returned error can't find the container with id e5a9df6d069c1c64662e8495b04da373f3381bbd43948cfcb8bc75487045f6e2 Nov 26 15:33:22 crc kubenswrapper[4785]: I1126 15:33:22.395672 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghbwg"] Nov 26 15:33:23 crc kubenswrapper[4785]: I1126 15:33:23.151731 4785 generic.go:334] "Generic (PLEG): container finished" podID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerID="5e4a529b7319bcf2293f91ed3726ea197be6826d2eacab8b5611b309a0522ab5" exitCode=0 Nov 26 15:33:23 crc kubenswrapper[4785]: I1126 15:33:23.152073 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhpr7" event={"ID":"9d23856b-e9af-4a74-bddc-f17804e47ffa","Type":"ContainerDied","Data":"5e4a529b7319bcf2293f91ed3726ea197be6826d2eacab8b5611b309a0522ab5"} Nov 26 15:33:23 crc kubenswrapper[4785]: I1126 15:33:23.154916 4785 generic.go:334] "Generic (PLEG): container finished" podID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerID="bf2332296f6af56600c74496c7dfc4479ab7388c7d8cdbb2b4652a60bb1b5354" exitCode=0 Nov 26 15:33:23 crc kubenswrapper[4785]: I1126 15:33:23.154951 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghbwg" event={"ID":"f9f06b92-7cee-45e2-870e-aa8805ec08fa","Type":"ContainerDied","Data":"bf2332296f6af56600c74496c7dfc4479ab7388c7d8cdbb2b4652a60bb1b5354"} Nov 26 15:33:23 crc kubenswrapper[4785]: I1126 
15:33:23.154975 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghbwg" event={"ID":"f9f06b92-7cee-45e2-870e-aa8805ec08fa","Type":"ContainerStarted","Data":"e5a9df6d069c1c64662e8495b04da373f3381bbd43948cfcb8bc75487045f6e2"} Nov 26 15:33:24 crc kubenswrapper[4785]: I1126 15:33:24.335997 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:24 crc kubenswrapper[4785]: I1126 15:33:24.336319 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:24 crc kubenswrapper[4785]: I1126 15:33:24.395490 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.037190 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:33:25 crc kubenswrapper[4785]: E1126 15:33:25.037536 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.174035 4785 generic.go:334] "Generic (PLEG): container finished" podID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerID="176a8fa07192183f08fd07ad8c3a8381a7a766fcda79263e03faa49f5f282d75" exitCode=0 Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.174260 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghbwg" 
event={"ID":"f9f06b92-7cee-45e2-870e-aa8805ec08fa","Type":"ContainerDied","Data":"176a8fa07192183f08fd07ad8c3a8381a7a766fcda79263e03faa49f5f282d75"} Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.178836 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhpr7" event={"ID":"9d23856b-e9af-4a74-bddc-f17804e47ffa","Type":"ContainerStarted","Data":"5f5693f19addbc2d9415e35aff03b3d7eb99db31d8eb9a265fd90aa26357c093"} Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.233589 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vhpr7" podStartSLOduration=3.333990062 podStartE2EDuration="6.233535502s" podCreationTimestamp="2025-11-26 15:33:19 +0000 UTC" firstStartedPulling="2025-11-26 15:33:21.125264173 +0000 UTC m=+1564.803629937" lastFinishedPulling="2025-11-26 15:33:24.024809583 +0000 UTC m=+1567.703175377" observedRunningTime="2025-11-26 15:33:25.217546798 +0000 UTC m=+1568.895912652" watchObservedRunningTime="2025-11-26 15:33:25.233535502 +0000 UTC m=+1568.911901296" Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.247103 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.918051 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.918082 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:25 crc kubenswrapper[4785]: I1126 15:33:25.970444 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:26 crc kubenswrapper[4785]: I1126 15:33:26.190780 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ghbwg" event={"ID":"f9f06b92-7cee-45e2-870e-aa8805ec08fa","Type":"ContainerStarted","Data":"19cbaf2091bc1870a522e061f2e8e7052a07c54504a08eff2729b023f33a76b5"} Nov 26 15:33:26 crc kubenswrapper[4785]: I1126 15:33:26.210070 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghbwg" podStartSLOduration=2.565629034 podStartE2EDuration="5.210049851s" podCreationTimestamp="2025-11-26 15:33:21 +0000 UTC" firstStartedPulling="2025-11-26 15:33:23.156939487 +0000 UTC m=+1566.835305251" lastFinishedPulling="2025-11-26 15:33:25.801360304 +0000 UTC m=+1569.479726068" observedRunningTime="2025-11-26 15:33:26.208943691 +0000 UTC m=+1569.887309485" watchObservedRunningTime="2025-11-26 15:33:26.210049851 +0000 UTC m=+1569.888415625" Nov 26 15:33:26 crc kubenswrapper[4785]: I1126 15:33:26.253771 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:27 crc kubenswrapper[4785]: I1126 15:33:27.767090 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sh25t"] Nov 26 15:33:28 crc kubenswrapper[4785]: I1126 15:33:28.213053 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sh25t" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerName="registry-server" containerID="cri-o://4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478" gracePeriod=2 Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.137031 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.218936 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-utilities\") pod \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.219068 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm79p\" (UniqueName: \"kubernetes.io/projected/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-kube-api-access-mm79p\") pod \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.219129 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-catalog-content\") pod \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\" (UID: \"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d\") " Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.219785 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-utilities" (OuterVolumeSpecName: "utilities") pod "a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" (UID: "a0142e7b-caff-4c76-86db-0b2fa7f3fc6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.226410 4785 generic.go:334] "Generic (PLEG): container finished" podID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerID="4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478" exitCode=0 Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.226471 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sh25t" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.226472 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh25t" event={"ID":"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d","Type":"ContainerDied","Data":"4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478"} Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.226625 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sh25t" event={"ID":"a0142e7b-caff-4c76-86db-0b2fa7f3fc6d","Type":"ContainerDied","Data":"775e3e87ded51a00376228088ad279d5653ce542b982e743ee9c1cfaaba6818c"} Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.226664 4785 scope.go:117] "RemoveContainer" containerID="4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.226687 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-kube-api-access-mm79p" (OuterVolumeSpecName: "kube-api-access-mm79p") pod "a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" (UID: "a0142e7b-caff-4c76-86db-0b2fa7f3fc6d"). InnerVolumeSpecName "kube-api-access-mm79p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.263867 4785 scope.go:117] "RemoveContainer" containerID="1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.281180 4785 scope.go:117] "RemoveContainer" containerID="9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.314618 4785 scope.go:117] "RemoveContainer" containerID="4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478" Nov 26 15:33:29 crc kubenswrapper[4785]: E1126 15:33:29.315054 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478\": container with ID starting with 4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478 not found: ID does not exist" containerID="4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.315107 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478"} err="failed to get container status \"4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478\": rpc error: code = NotFound desc = could not find container \"4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478\": container with ID starting with 4526b6aa13f141d9486020c635c77f0b3510ec7632150fe52fba38ae1fa0a478 not found: ID does not exist" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.315140 4785 scope.go:117] "RemoveContainer" containerID="1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc" Nov 26 15:33:29 crc kubenswrapper[4785]: E1126 15:33:29.315470 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc\": container with ID starting with 1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc not found: ID does not exist" containerID="1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.315500 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc"} err="failed to get container status \"1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc\": rpc error: code = NotFound desc = could not find container \"1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc\": container with ID starting with 1add0f980337bebe7a50ad24d3eee9fff5b5d17cd273bd8d17bf8c36db122afc not found: ID does not exist" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.315521 4785 scope.go:117] "RemoveContainer" containerID="9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4" Nov 26 15:33:29 crc kubenswrapper[4785]: E1126 15:33:29.315812 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4\": container with ID starting with 9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4 not found: ID does not exist" containerID="9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.315849 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4"} err="failed to get container status \"9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4\": rpc error: code = NotFound desc = could not find container \"9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4\": 
container with ID starting with 9e5cc22c153f6e8f746706cc9fd003cc4a83e1acca452496d9b881e461561bc4 not found: ID does not exist" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.320233 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm79p\" (UniqueName: \"kubernetes.io/projected/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-kube-api-access-mm79p\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.320358 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.373636 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" (UID: "a0142e7b-caff-4c76-86db-0b2fa7f3fc6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.422298 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.568405 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sh25t"] Nov 26 15:33:29 crc kubenswrapper[4785]: I1126 15:33:29.574501 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sh25t"] Nov 26 15:33:30 crc kubenswrapper[4785]: I1126 15:33:30.123387 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:30 crc kubenswrapper[4785]: I1126 15:33:30.123450 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:30 crc kubenswrapper[4785]: I1126 15:33:30.166080 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:30 crc kubenswrapper[4785]: I1126 15:33:30.279356 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:31 crc kubenswrapper[4785]: I1126 15:33:31.047407 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" path="/var/lib/kubelet/pods/a0142e7b-caff-4c76-86db-0b2fa7f3fc6d/volumes" Nov 26 15:33:31 crc kubenswrapper[4785]: I1126 15:33:31.901363 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:31 crc kubenswrapper[4785]: I1126 15:33:31.901439 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:31 crc kubenswrapper[4785]: I1126 15:33:31.962779 4785 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:32 crc kubenswrapper[4785]: I1126 15:33:32.323711 4785 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:35 crc kubenswrapper[4785]: I1126 15:33:35.766881 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vhpr7"] Nov 26 15:33:35 crc kubenswrapper[4785]: I1126 15:33:35.767444 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vhpr7" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="registry-server" containerID="cri-o://5f5693f19addbc2d9415e35aff03b3d7eb99db31d8eb9a265fd90aa26357c093" gracePeriod=2 Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.036874 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:33:36 crc kubenswrapper[4785]: E1126 15:33:36.037455 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.167880 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghbwg"] Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.168221 4785 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-ghbwg" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="registry-server" containerID="cri-o://19cbaf2091bc1870a522e061f2e8e7052a07c54504a08eff2729b023f33a76b5" gracePeriod=2 Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.291866 4785 generic.go:334] "Generic (PLEG): container finished" podID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerID="19cbaf2091bc1870a522e061f2e8e7052a07c54504a08eff2729b023f33a76b5" exitCode=0 Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.291956 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghbwg" event={"ID":"f9f06b92-7cee-45e2-870e-aa8805ec08fa","Type":"ContainerDied","Data":"19cbaf2091bc1870a522e061f2e8e7052a07c54504a08eff2729b023f33a76b5"} Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.294716 4785 generic.go:334] "Generic (PLEG): container finished" podID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerID="5f5693f19addbc2d9415e35aff03b3d7eb99db31d8eb9a265fd90aa26357c093" exitCode=0 Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.294754 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhpr7" event={"ID":"9d23856b-e9af-4a74-bddc-f17804e47ffa","Type":"ContainerDied","Data":"5f5693f19addbc2d9415e35aff03b3d7eb99db31d8eb9a265fd90aa26357c093"} Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.294809 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vhpr7" event={"ID":"9d23856b-e9af-4a74-bddc-f17804e47ffa","Type":"ContainerDied","Data":"1bd13247ef4566364d3a99b99a74bd17cdbec1a096daee5dedce3bd0837cfb9d"} Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.294824 4785 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bd13247ef4566364d3a99b99a74bd17cdbec1a096daee5dedce3bd0837cfb9d" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.347501 4785 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.435322 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-utilities\") pod \"9d23856b-e9af-4a74-bddc-f17804e47ffa\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.435478 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8hw7\" (UniqueName: \"kubernetes.io/projected/9d23856b-e9af-4a74-bddc-f17804e47ffa-kube-api-access-m8hw7\") pod \"9d23856b-e9af-4a74-bddc-f17804e47ffa\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.435504 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-catalog-content\") pod \"9d23856b-e9af-4a74-bddc-f17804e47ffa\" (UID: \"9d23856b-e9af-4a74-bddc-f17804e47ffa\") " Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.436315 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-utilities" (OuterVolumeSpecName: "utilities") pod "9d23856b-e9af-4a74-bddc-f17804e47ffa" (UID: "9d23856b-e9af-4a74-bddc-f17804e47ffa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.446149 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d23856b-e9af-4a74-bddc-f17804e47ffa-kube-api-access-m8hw7" (OuterVolumeSpecName: "kube-api-access-m8hw7") pod "9d23856b-e9af-4a74-bddc-f17804e47ffa" (UID: "9d23856b-e9af-4a74-bddc-f17804e47ffa"). 
InnerVolumeSpecName "kube-api-access-m8hw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.496451 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d23856b-e9af-4a74-bddc-f17804e47ffa" (UID: "9d23856b-e9af-4a74-bddc-f17804e47ffa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.537888 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8hw7\" (UniqueName: \"kubernetes.io/projected/9d23856b-e9af-4a74-bddc-f17804e47ffa-kube-api-access-m8hw7\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.537956 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.537978 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d23856b-e9af-4a74-bddc-f17804e47ffa-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.558626 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.638668 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-utilities\") pod \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.638983 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw7bf\" (UniqueName: \"kubernetes.io/projected/f9f06b92-7cee-45e2-870e-aa8805ec08fa-kube-api-access-nw7bf\") pod \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.639575 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-utilities" (OuterVolumeSpecName: "utilities") pod "f9f06b92-7cee-45e2-870e-aa8805ec08fa" (UID: "f9f06b92-7cee-45e2-870e-aa8805ec08fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.639593 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-catalog-content\") pod \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\" (UID: \"f9f06b92-7cee-45e2-870e-aa8805ec08fa\") " Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.640429 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.641751 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f06b92-7cee-45e2-870e-aa8805ec08fa-kube-api-access-nw7bf" (OuterVolumeSpecName: "kube-api-access-nw7bf") pod "f9f06b92-7cee-45e2-870e-aa8805ec08fa" (UID: "f9f06b92-7cee-45e2-870e-aa8805ec08fa"). InnerVolumeSpecName "kube-api-access-nw7bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.656987 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9f06b92-7cee-45e2-870e-aa8805ec08fa" (UID: "f9f06b92-7cee-45e2-870e-aa8805ec08fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.741656 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9f06b92-7cee-45e2-870e-aa8805ec08fa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:36 crc kubenswrapper[4785]: I1126 15:33:36.741715 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw7bf\" (UniqueName: \"kubernetes.io/projected/f9f06b92-7cee-45e2-870e-aa8805ec08fa-kube-api-access-nw7bf\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.306073 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vhpr7" Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.306091 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghbwg" event={"ID":"f9f06b92-7cee-45e2-870e-aa8805ec08fa","Type":"ContainerDied","Data":"e5a9df6d069c1c64662e8495b04da373f3381bbd43948cfcb8bc75487045f6e2"} Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.306629 4785 scope.go:117] "RemoveContainer" containerID="19cbaf2091bc1870a522e061f2e8e7052a07c54504a08eff2729b023f33a76b5" Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.306129 4785 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghbwg" Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.330832 4785 scope.go:117] "RemoveContainer" containerID="176a8fa07192183f08fd07ad8c3a8381a7a766fcda79263e03faa49f5f282d75" Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.335951 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghbwg"] Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.343185 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghbwg"] Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.357996 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vhpr7"] Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.364213 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vhpr7"] Nov 26 15:33:37 crc kubenswrapper[4785]: I1126 15:33:37.367922 4785 scope.go:117] "RemoveContainer" containerID="bf2332296f6af56600c74496c7dfc4479ab7388c7d8cdbb2b4652a60bb1b5354" Nov 26 15:33:39 crc kubenswrapper[4785]: I1126 15:33:39.047990 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" path="/var/lib/kubelet/pods/9d23856b-e9af-4a74-bddc-f17804e47ffa/volumes" Nov 26 15:33:39 crc kubenswrapper[4785]: I1126 15:33:39.048945 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" path="/var/lib/kubelet/pods/f9f06b92-7cee-45e2-870e-aa8805ec08fa/volumes" Nov 26 15:33:40 crc kubenswrapper[4785]: I1126 15:33:40.570380 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xlnfx"] Nov 26 15:33:40 crc kubenswrapper[4785]: I1126 15:33:40.570782 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xlnfx" 
podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="registry-server" containerID="cri-o://eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45" gracePeriod=2 Nov 26 15:33:40 crc kubenswrapper[4785]: I1126 15:33:40.991208 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.006531 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-catalog-content\") pod \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.006606 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-utilities\") pod \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.006817 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkh6n\" (UniqueName: \"kubernetes.io/projected/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-kube-api-access-gkh6n\") pod \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\" (UID: \"4e28a930-bff6-48b6-b7ac-e34c20ac22e1\") " Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.010912 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-utilities" (OuterVolumeSpecName: "utilities") pod "4e28a930-bff6-48b6-b7ac-e34c20ac22e1" (UID: "4e28a930-bff6-48b6-b7ac-e34c20ac22e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.015153 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-kube-api-access-gkh6n" (OuterVolumeSpecName: "kube-api-access-gkh6n") pod "4e28a930-bff6-48b6-b7ac-e34c20ac22e1" (UID: "4e28a930-bff6-48b6-b7ac-e34c20ac22e1"). InnerVolumeSpecName "kube-api-access-gkh6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.101750 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e28a930-bff6-48b6-b7ac-e34c20ac22e1" (UID: "4e28a930-bff6-48b6-b7ac-e34c20ac22e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.109122 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkh6n\" (UniqueName: \"kubernetes.io/projected/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-kube-api-access-gkh6n\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.109162 4785 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.109172 4785 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e28a930-bff6-48b6-b7ac-e34c20ac22e1-utilities\") on node \"crc\" DevicePath \"\"" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.343610 4785 generic.go:334] "Generic (PLEG): container finished" podID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" 
containerID="eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45" exitCode=0 Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.343652 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlnfx" event={"ID":"4e28a930-bff6-48b6-b7ac-e34c20ac22e1","Type":"ContainerDied","Data":"eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45"} Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.343677 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xlnfx" event={"ID":"4e28a930-bff6-48b6-b7ac-e34c20ac22e1","Type":"ContainerDied","Data":"fe6acd255b84987ed42328d0e6390d250b139b463e16cf3f18044633a4deda02"} Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.343682 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xlnfx" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.343693 4785 scope.go:117] "RemoveContainer" containerID="eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.391298 4785 scope.go:117] "RemoveContainer" containerID="a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.419521 4785 scope.go:117] "RemoveContainer" containerID="1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.449664 4785 scope.go:117] "RemoveContainer" containerID="eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45" Nov 26 15:33:41 crc kubenswrapper[4785]: E1126 15:33:41.450528 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45\": container with ID starting with eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45 not 
found: ID does not exist" containerID="eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.450594 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45"} err="failed to get container status \"eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45\": rpc error: code = NotFound desc = could not find container \"eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45\": container with ID starting with eda36acfef263364be7cd607d5c7993fec8d7a148d6d572cefc792464b616f45 not found: ID does not exist" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.450626 4785 scope.go:117] "RemoveContainer" containerID="a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da" Nov 26 15:33:41 crc kubenswrapper[4785]: E1126 15:33:41.451927 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da\": container with ID starting with a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da not found: ID does not exist" containerID="a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.452072 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da"} err="failed to get container status \"a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da\": rpc error: code = NotFound desc = could not find container \"a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da\": container with ID starting with a13702b2bf9d620b3a926588d390bfe87376758c146969f9c4afa958db4bb4da not found: ID does not exist" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.452168 
4785 scope.go:117] "RemoveContainer" containerID="1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761" Nov 26 15:33:41 crc kubenswrapper[4785]: E1126 15:33:41.452749 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761\": container with ID starting with 1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761 not found: ID does not exist" containerID="1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.452893 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761"} err="failed to get container status \"1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761\": rpc error: code = NotFound desc = could not find container \"1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761\": container with ID starting with 1b99da5e55f81b26936ae47f0c68cfb793c05305ced99ffcab6b4c3cedd81761 not found: ID does not exist" Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.460860 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xlnfx"] Nov 26 15:33:41 crc kubenswrapper[4785]: I1126 15:33:41.468994 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xlnfx"] Nov 26 15:33:43 crc kubenswrapper[4785]: I1126 15:33:43.050822 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" path="/var/lib/kubelet/pods/4e28a930-bff6-48b6-b7ac-e34c20ac22e1/volumes" Nov 26 15:33:49 crc kubenswrapper[4785]: I1126 15:33:49.036303 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:33:49 crc kubenswrapper[4785]: E1126 
15:33:49.036758 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.209648 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210625 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210644 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210665 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210672 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210680 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210689 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210701 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" 
containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210709 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210720 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210728 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210741 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210749 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210766 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210774 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210785 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210792 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210802 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" 
containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210810 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210824 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210831 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210841 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210849 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="extract-utilities" Nov 26 15:33:56 crc kubenswrapper[4785]: E1126 15:33:56.210858 4785 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.210865 4785 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" containerName="extract-content" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.211034 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0142e7b-caff-4c76-86db-0b2fa7f3fc6d" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.211059 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d23856b-e9af-4a74-bddc-f17804e47ffa" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.211072 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e28a930-bff6-48b6-b7ac-e34c20ac22e1" 
containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.211086 4785 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f06b92-7cee-45e2-870e-aa8805ec08fa" containerName="registry-server" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.211664 4785 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.215836 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-sflsz" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.216017 4785 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.216245 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.216384 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.228131 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.267413 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.267509 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-scripts\") pod \"openstackclient\" 
(UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.267534 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-config\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.267626 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcbh\" (UniqueName: \"kubernetes.io/projected/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-kube-api-access-hlcbh\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.368894 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcbh\" (UniqueName: \"kubernetes.io/projected/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-kube-api-access-hlcbh\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.369174 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.369283 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-scripts\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") 
" pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.369364 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-config\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.370109 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-config\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.370135 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-scripts\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.375080 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.386953 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcbh\" (UniqueName: \"kubernetes.io/projected/b8ce9665-24ea-4f7e-93b3-4ca67c53f109-kube-api-access-hlcbh\") pod \"openstackclient\" (UID: \"b8ce9665-24ea-4f7e-93b3-4ca67c53f109\") " pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.526937 4785 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 26 15:33:56 crc kubenswrapper[4785]: I1126 15:33:56.980388 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 26 15:33:57 crc kubenswrapper[4785]: I1126 15:33:57.498700 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"b8ce9665-24ea-4f7e-93b3-4ca67c53f109","Type":"ContainerStarted","Data":"36931f872870e5ee986c6033792f14dd70d787340900c223394c923e50e2b78d"} Nov 26 15:33:57 crc kubenswrapper[4785]: I1126 15:33:57.498766 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"b8ce9665-24ea-4f7e-93b3-4ca67c53f109","Type":"ContainerStarted","Data":"a126ca6587649d6a1401bdef114f7d11be17b1e6f780a6be3a5b51546fa83f98"} Nov 26 15:33:57 crc kubenswrapper[4785]: I1126 15:33:57.526320 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.5262974740000002 podStartE2EDuration="1.526297474s" podCreationTimestamp="2025-11-26 15:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-26 15:33:57.520940769 +0000 UTC m=+1601.199306543" watchObservedRunningTime="2025-11-26 15:33:57.526297474 +0000 UTC m=+1601.204663248" Nov 26 15:34:03 crc kubenswrapper[4785]: I1126 15:34:03.036046 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:34:03 crc kubenswrapper[4785]: E1126 15:34:03.036923 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:34:18 crc kubenswrapper[4785]: I1126 15:34:18.036448 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:34:18 crc kubenswrapper[4785]: E1126 15:34:18.037380 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:34:30 crc kubenswrapper[4785]: I1126 15:34:30.036767 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:34:30 crc kubenswrapper[4785]: E1126 15:34:30.037590 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:34:41 crc kubenswrapper[4785]: I1126 15:34:41.036323 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:34:41 crc kubenswrapper[4785]: E1126 15:34:41.037682 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:34:56 crc kubenswrapper[4785]: I1126 15:34:56.036506 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:34:56 crc kubenswrapper[4785]: E1126 15:34:56.037610 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:35:11 crc kubenswrapper[4785]: I1126 15:35:11.037062 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:35:11 crc kubenswrapper[4785]: E1126 15:35:11.037875 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.031070 4785 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s5v8p/must-gather-hrpl6"] Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.033082 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.035869 4785 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-s5v8p"/"default-dockercfg-md5tt" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.036096 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-s5v8p"/"openshift-service-ca.crt" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.036351 4785 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-s5v8p"/"kube-root-ca.crt" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.055900 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s5v8p/must-gather-hrpl6"] Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.112921 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7809fca-9387-4705-a57d-eaf21ba07762-must-gather-output\") pod \"must-gather-hrpl6\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.113056 4785 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqzl\" (UniqueName: \"kubernetes.io/projected/c7809fca-9387-4705-a57d-eaf21ba07762-kube-api-access-wvqzl\") pod \"must-gather-hrpl6\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.214492 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7809fca-9387-4705-a57d-eaf21ba07762-must-gather-output\") pod \"must-gather-hrpl6\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " 
pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.214591 4785 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqzl\" (UniqueName: \"kubernetes.io/projected/c7809fca-9387-4705-a57d-eaf21ba07762-kube-api-access-wvqzl\") pod \"must-gather-hrpl6\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.215002 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7809fca-9387-4705-a57d-eaf21ba07762-must-gather-output\") pod \"must-gather-hrpl6\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.237598 4785 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqzl\" (UniqueName: \"kubernetes.io/projected/c7809fca-9387-4705-a57d-eaf21ba07762-kube-api-access-wvqzl\") pod \"must-gather-hrpl6\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.354357 4785 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:35:17 crc kubenswrapper[4785]: I1126 15:35:17.563248 4785 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s5v8p/must-gather-hrpl6"] Nov 26 15:35:18 crc kubenswrapper[4785]: I1126 15:35:18.176381 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" event={"ID":"c7809fca-9387-4705-a57d-eaf21ba07762","Type":"ContainerStarted","Data":"6c8f76eb85ff4939558f8f1750ae73471772e9734e1b91b39c6b129484433da0"} Nov 26 15:35:22 crc kubenswrapper[4785]: I1126 15:35:22.205179 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" event={"ID":"c7809fca-9387-4705-a57d-eaf21ba07762","Type":"ContainerStarted","Data":"50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff"} Nov 26 15:35:22 crc kubenswrapper[4785]: I1126 15:35:22.205754 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" event={"ID":"c7809fca-9387-4705-a57d-eaf21ba07762","Type":"ContainerStarted","Data":"d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc"} Nov 26 15:35:22 crc kubenswrapper[4785]: I1126 15:35:22.226120 4785 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" podStartSLOduration=1.752147496 podStartE2EDuration="5.226101817s" podCreationTimestamp="2025-11-26 15:35:17 +0000 UTC" firstStartedPulling="2025-11-26 15:35:17.572361943 +0000 UTC m=+1681.250727707" lastFinishedPulling="2025-11-26 15:35:21.046316264 +0000 UTC m=+1684.724682028" observedRunningTime="2025-11-26 15:35:22.223978289 +0000 UTC m=+1685.902344074" watchObservedRunningTime="2025-11-26 15:35:22.226101817 +0000 UTC m=+1685.904467581" Nov 26 15:35:26 crc kubenswrapper[4785]: I1126 15:35:26.036387 4785 scope.go:117] "RemoveContainer" 
containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:35:26 crc kubenswrapper[4785]: E1126 15:35:26.037910 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:35:38 crc kubenswrapper[4785]: I1126 15:35:38.036273 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:35:38 crc kubenswrapper[4785]: E1126 15:35:38.036958 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:35:50 crc kubenswrapper[4785]: I1126 15:35:50.036652 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:35:50 crc kubenswrapper[4785]: E1126 15:35:50.037522 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.035751 4785 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/util/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.232056 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/util/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.256782 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/pull/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.269626 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/pull/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.395096 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/util/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.431744 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/pull/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.438334 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_27e8bc079695f3aed52a6c5be68196d91a6230a1a03a8fc87a19aa534flmpn9_4aad6667-7a55-4c14-a191-7723fd1e5274/extract/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.577971 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/util/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.723327 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/pull/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.730301 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/pull/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.765666 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/util/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.890052 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/pull/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.892529 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/util/0.log" Nov 26 15:35:55 crc kubenswrapper[4785]: I1126 15:35:55.934874 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3375e8518d2544d2be57982cef9b070243a887947cb6dc52d06f274d4esfw28_0da8226f-52cd-44a6-9fc6-b30c8a92c074/extract/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.048418 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/util/0.log" Nov 26 15:35:56 crc 
kubenswrapper[4785]: I1126 15:35:56.267824 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/pull/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.298799 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/pull/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.316464 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/util/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.424135 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/util/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.453732 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/extract/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.453923 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5d6ndpx_362049d6-ebac-4703-b856-408cc878f2b6/pull/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.608823 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/util/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.781577 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/pull/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.784849 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/pull/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.801909 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/util/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.951184 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/util/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.957222 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/extract/0.log" Nov 26 15:35:56 crc kubenswrapper[4785]: I1126 15:35:56.964909 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240bvx5zp_a6f1913b-2431-4998-a360-fc87606e990e/pull/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.112779 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/util/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.289906 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/util/0.log" Nov 26 15:35:57 crc 
kubenswrapper[4785]: I1126 15:35:57.291944 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/pull/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.292128 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/pull/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.445416 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/util/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.449593 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/pull/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.472927 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590mtm2r_c784173a-04b2-490a-b83a-ce589d9b5459/extract/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.620744 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/util/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.790975 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/pull/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.796275 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/util/0.log" Nov 26 15:35:57 crc kubenswrapper[4785]: I1126 15:35:57.800936 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/pull/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.007488 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/util/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.043183 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/pull/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.064700 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/util/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.097634 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368cf67k9_353baaac-9344-4e21-af31-6695033fd724/extract/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.232860 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/util/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.248825 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/pull/0.log" Nov 26 15:35:58 crc 
kubenswrapper[4785]: I1126 15:35:58.267418 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/pull/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.406539 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/util/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.437677 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/pull/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.440704 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa362xf6_bdc1a44d-c408-48d0-8df4-d52394f541eb/extract/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.467973 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-56ccd5f88c-dzft5_62cac43a-a147-46b5-bbd6-4b452a008291/manager/3.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.598418 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-56ccd5f88c-dzft5_62cac43a-a147-46b5-bbd6-4b452a008291/manager/2.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.654375 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-index-tfk4s_4652c290-44a9-4e40-b880-c73c6be91f2d/registry-server/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.680078 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-647db694df-qnrxh_3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e/manager/3.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.760372 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-647db694df-qnrxh_3e7e6d72-e83c-4188-a2bc-11b7c16e6e8e/manager/2.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.823388 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-tssqb_3e49c55a-9d72-4b78-ac75-84fba908f67b/registry-server/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.878965 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f68bdc44b-4p65x_b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3/kube-rbac-proxy/0.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.961698 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f68bdc44b-4p65x_b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3/manager/3.log" Nov 26 15:35:58 crc kubenswrapper[4785]: I1126 15:35:58.980093 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f68bdc44b-4p65x_b37e9a18-1d3b-4a5b-aa23-d9cc2f6394e3/manager/2.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.103670 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-lf5bg_f83f3408-4867-48ce-8161-9c07c4e887ec/registry-server/0.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.178197 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-68b4f95d6c-cpkqd_bc8e3329-ae9c-48b1-a49c-92eeef6ae114/manager/3.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.180174 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-68b4f95d6c-cpkqd_bc8e3329-ae9c-48b1-a49c-92eeef6ae114/manager/2.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.297999 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-lknwz_e46e72bd-d6c1-48b6-a702-8256e1057ea6/registry-server/0.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.337051 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-747fb5cb85-5slw2_8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a/manager/3.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.341670 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-747fb5cb85-5slw2_8d72a0d6-d729-4c0f-90c7-2a5eca6fc32a/manager/2.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.462460 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-h9jqb_06d34fac-12a4-41e4-96b6-1d8df99cfee4/registry-server/0.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.509123 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-6pwlx_c39759f2-3183-48fa-aaee-14b24c5337d7/operator/3.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.518129 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-6pwlx_c39759f2-3183-48fa-aaee-14b24c5337d7/operator/2.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.644829 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-znj27_50491112-0d7a-44d0-b66f-9920357a2eff/registry-server/0.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.700207 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5d784fc5bb-kn67f_60b24860-07b4-4841-9c4a-a5e6456a45dc/manager/3.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.733727 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5d784fc5bb-kn67f_60b24860-07b4-4841-9c4a-a5e6456a45dc/manager/2.log" Nov 26 15:35:59 crc kubenswrapper[4785]: I1126 15:35:59.864244 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-c4ks4_6030eac1-f066-489e-9a17-5c2dd5d5880a/registry-server/0.log" Nov 26 15:36:02 crc kubenswrapper[4785]: I1126 15:36:02.036501 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:36:02 crc kubenswrapper[4785]: E1126 15:36:02.037095 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:36:14 crc kubenswrapper[4785]: I1126 15:36:14.134283 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h952m_bebeadc0-d563-42fc-9283-819249f42c0f/control-plane-machine-set-operator/0.log" Nov 26 15:36:14 crc kubenswrapper[4785]: I1126 15:36:14.307134 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lcfs6_dc5c31e5-88ab-41d0-9976-b63f97b85543/machine-api-operator/0.log" Nov 26 15:36:14 crc kubenswrapper[4785]: I1126 15:36:14.315263 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lcfs6_dc5c31e5-88ab-41d0-9976-b63f97b85543/kube-rbac-proxy/0.log" Nov 26 15:36:15 crc kubenswrapper[4785]: I1126 15:36:15.039008 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:36:15 crc kubenswrapper[4785]: E1126 15:36:15.039197 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:36:27 crc kubenswrapper[4785]: I1126 15:36:27.043990 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:36:27 crc kubenswrapper[4785]: E1126 15:36:27.044938 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:36:29 crc kubenswrapper[4785]: I1126 15:36:29.651001 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-9gfl9_6aa1ab43-d7e6-412b-a16d-d92e016442bb/controller/0.log" Nov 26 15:36:29 crc kubenswrapper[4785]: I1126 15:36:29.659005 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-9gfl9_6aa1ab43-d7e6-412b-a16d-d92e016442bb/kube-rbac-proxy/0.log" Nov 26 15:36:29 crc kubenswrapper[4785]: I1126 
15:36:29.780713 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-frr-files/0.log" Nov 26 15:36:29 crc kubenswrapper[4785]: I1126 15:36:29.953922 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-frr-files/0.log" Nov 26 15:36:29 crc kubenswrapper[4785]: I1126 15:36:29.963311 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-reloader/0.log" Nov 26 15:36:29 crc kubenswrapper[4785]: I1126 15:36:29.976586 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-reloader/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.028523 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-metrics/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.143645 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-reloader/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.144298 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-frr-files/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.158938 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-metrics/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.234964 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-metrics/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.389521 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-reloader/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.390356 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-metrics/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.390693 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/cp-frr-files/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.411703 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/controller/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.600612 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/kube-rbac-proxy/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.602616 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/frr-metrics/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.668777 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/kube-rbac-proxy-frr/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.853656 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/reloader/0.log" Nov 26 15:36:30 crc kubenswrapper[4785]: I1126 15:36:30.932992 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-7bmlh_98a95f21-9e38-4113-848b-6b9ced267e38/frr-k8s-webhook-server/0.log" Nov 26 15:36:31 crc kubenswrapper[4785]: I1126 15:36:31.030877 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h5jcn_c54bc198-6911-461a-9c78-dfd7fd744524/frr/0.log" Nov 26 15:36:31 crc kubenswrapper[4785]: I1126 15:36:31.052592 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84667dbb5-sslgl_161001b1-a5be-49ea-8031-e2c11dd07800/manager/3.log" Nov 26 15:36:31 crc kubenswrapper[4785]: I1126 15:36:31.156608 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84667dbb5-sslgl_161001b1-a5be-49ea-8031-e2c11dd07800/manager/2.log" Nov 26 15:36:31 crc kubenswrapper[4785]: I1126 15:36:31.252106 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cf4498f-spzbf_3bc35760-4dcf-49b0-a6c7-19f57d889012/webhook-server/0.log" Nov 26 15:36:31 crc kubenswrapper[4785]: I1126 15:36:31.299951 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mdltn_2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d/kube-rbac-proxy/0.log" Nov 26 15:36:31 crc kubenswrapper[4785]: I1126 15:36:31.531341 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mdltn_2a6a141b-7f8b-4b07-8afb-33bd21fc7b7d/speaker/0.log" Nov 26 15:36:39 crc kubenswrapper[4785]: I1126 15:36:39.036175 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:36:39 crc kubenswrapper[4785]: E1126 15:36:39.036830 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:36:45 crc kubenswrapper[4785]: I1126 15:36:45.586300 
4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-5542-account-create-update-kgwxd_b5cd9a63-e7cf-4c00-b352-be258e30c83b/mariadb-account-create-update/0.log" Nov 26 15:36:45 crc kubenswrapper[4785]: I1126 15:36:45.675990 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-cache-glance-default-external-api-0-cleaner-2940285v2t8b_dc512af0-b1e9-4ee7-9e2c-ac53acd314f7/glance-cache-glance-default-external-api-0-cleaner/0.log" Nov 26 15:36:45 crc kubenswrapper[4785]: I1126 15:36:45.803983 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-cache-glance-default-internal-api-0-cleaner-2940285mwtnm_89d71759-41c7-4c92-b800-7305c551e991/glance-cache-glance-default-internal-api-0-cleaner/0.log" Nov 26 15:36:45 crc kubenswrapper[4785]: I1126 15:36:45.872482 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-create-dr56t_37b8d5b5-97ff-4bb8-9149-1af3d8aa824f/mariadb-database-create/0.log" Nov 26 15:36:45 crc kubenswrapper[4785]: I1126 15:36:45.981481 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-sync-5bhl4_e81fe919-3471-4913-a892-b03f703d3ed9/glance-db-sync/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.052407 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_59fd54c3-fef6-4165-ab5a-3bb74543da8b/glance-api/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.140430 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_59fd54c3-fef6-4165-ab5a-3bb74543da8b/glance-httpd/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.167398 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-external-api-0_59fd54c3-fef6-4165-ab5a-3bb74543da8b/glance-log/0.log" Nov 26 15:36:46 crc 
kubenswrapper[4785]: I1126 15:36:46.234988 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_82000356-60a7-4c1b-8f86-c7ecc1a25d7f/glance-api/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.319919 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_82000356-60a7-4c1b-8f86-c7ecc1a25d7f/glance-httpd/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.342543 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-internal-api-0_82000356-60a7-4c1b-8f86-c7ecc1a25d7f/glance-log/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.687732 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-5b85f48447-rwscr_578b1e05-62bf-4cc2-921f-cbfccf41a170/keystone-api/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.751684 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_b12a01ef-3cf0-4e03-b38b-9b306ce01fdf/mysql-bootstrap/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.903610 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_b12a01ef-3cf0-4e03-b38b-9b306ce01fdf/mysql-bootstrap/0.log" Nov 26 15:36:46 crc kubenswrapper[4785]: I1126 15:36:46.970301 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_b12a01ef-3cf0-4e03-b38b-9b306ce01fdf/galera/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.127294 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_a9248619-c310-43ae-b33a-b51f3e9d0a03/mysql-bootstrap/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.281536 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_a9248619-c310-43ae-b33a-b51f3e9d0a03/mysql-bootstrap/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.325249 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_a9248619-c310-43ae-b33a-b51f3e9d0a03/galera/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.513443 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_b8f3a4fb-df39-4059-a9dd-4f566b1e4860/mysql-bootstrap/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.743179 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_b8f3a4fb-df39-4059-a9dd-4f566b1e4860/mysql-bootstrap/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.771470 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_b8f3a4fb-df39-4059-a9dd-4f566b1e4860/galera/0.log" Nov 26 15:36:47 crc kubenswrapper[4785]: I1126 15:36:47.873896 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_a1feca48-dc9b-434e-8caf-608727943291/memcached/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.024048 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_b8ce9665-24ea-4f7e-93b3-4ca67c53f109/openstackclient/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.068315 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_4ace84da-cbee-4e2b-b473-67dac2985d5e/setup-container/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.254351 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_4ace84da-cbee-4e2b-b473-67dac2985d5e/setup-container/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.278767 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/glance-kuttl-tests_swift-proxy-6bd58cfcf7-mxwdz_438d5704-c198-4184-aba9-e9be2025f903/proxy-httpd/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.304045 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_4ace84da-cbee-4e2b-b473-67dac2985d5e/rabbitmq/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.415107 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-6bd58cfcf7-mxwdz_438d5704-c198-4184-aba9-e9be2025f903/proxy-server/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.473459 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-fsr6b_5d518dbb-f95d-409e-be26-ec87f47d465a/swift-ring-rebalance/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.591070 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/account-auditor/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.674148 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/account-reaper/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.680307 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/account-replicator/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.703752 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/account-server/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.716739 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/container-auditor/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.782938 4785 
log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/container-replicator/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.848795 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/container-updater/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.849002 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/container-server/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.868082 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/object-auditor/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.877032 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/object-expirer/0.log" Nov 26 15:36:48 crc kubenswrapper[4785]: I1126 15:36:48.971540 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/object-replicator/0.log" Nov 26 15:36:49 crc kubenswrapper[4785]: I1126 15:36:49.013982 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/object-updater/0.log" Nov 26 15:36:49 crc kubenswrapper[4785]: I1126 15:36:49.038994 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/rsync/0.log" Nov 26 15:36:49 crc kubenswrapper[4785]: I1126 15:36:49.046015 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/object-server/0.log" Nov 26 15:36:49 crc kubenswrapper[4785]: I1126 15:36:49.074874 4785 
log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_c22f4ea9-991d-4431-be3c-aeb8f547176e/swift-recon-cron/0.log" Nov 26 15:36:50 crc kubenswrapper[4785]: I1126 15:36:50.036631 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:36:50 crc kubenswrapper[4785]: E1126 15:36:50.037145 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.157537 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/extract-utilities/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.327514 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/extract-utilities/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.333426 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/extract-content/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.356377 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/extract-content/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.469296 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/extract-utilities/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.477895 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/extract-content/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.676687 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/extract-utilities/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.761057 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tnkfn_7b86e813-9dab-4537-be3d-9903e0b53f70/registry-server/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.790886 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/extract-utilities/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.837684 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/extract-content/0.log" Nov 26 15:37:02 crc kubenswrapper[4785]: I1126 15:37:02.905436 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/extract-content/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.014088 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/extract-utilities/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.034453 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/extract-content/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.035761 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:37:03 crc kubenswrapper[4785]: E1126 15:37:03.035994 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.260386 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/util/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.380662 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcdxx_e588c635-bfa9-4276-ba15-0ca8b125fc67/registry-server/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.454414 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/pull/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.458606 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/pull/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.483023 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/util/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.606027 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/util/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.606955 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/pull/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.629520 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6zpfj7_bcaf9d05-38af-46ec-b475-37ba51771361/extract/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.794460 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/extract-utilities/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.825279 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2fgr9_3dd50a40-cee2-4f3b-b522-cf1ab60c4be6/marketplace-operator/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.978392 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/extract-content/0.log" Nov 26 15:37:03 crc kubenswrapper[4785]: I1126 15:37:03.985514 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/extract-content/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.020045 4785 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/extract-utilities/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.177635 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/extract-content/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.223242 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/extract-utilities/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.265584 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ffv6f_75367016-4697-457d-8bbe-c874cfa6e712/registry-server/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.353260 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/extract-utilities/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.527225 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/extract-utilities/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.534874 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/extract-content/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.563513 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/extract-content/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.717455 4785 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/extract-content/0.log" Nov 26 15:37:04 crc kubenswrapper[4785]: I1126 15:37:04.746959 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/extract-utilities/0.log" Nov 26 15:37:05 crc kubenswrapper[4785]: I1126 15:37:05.135998 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-td9bd_449ae537-a267-4c68-9aea-2712023ab42f/registry-server/0.log" Nov 26 15:37:15 crc kubenswrapper[4785]: I1126 15:37:15.036978 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:37:15 crc kubenswrapper[4785]: E1126 15:37:15.037710 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:37:24 crc kubenswrapper[4785]: I1126 15:37:24.027562 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-5542-account-create-update-kgwxd"] Nov 26 15:37:24 crc kubenswrapper[4785]: I1126 15:37:24.034184 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-dr56t"] Nov 26 15:37:24 crc kubenswrapper[4785]: I1126 15:37:24.041774 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-dr56t"] Nov 26 15:37:24 crc kubenswrapper[4785]: I1126 15:37:24.047504 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-5542-account-create-update-kgwxd"] Nov 26 15:37:25 crc 
kubenswrapper[4785]: I1126 15:37:25.053235 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b8d5b5-97ff-4bb8-9149-1af3d8aa824f" path="/var/lib/kubelet/pods/37b8d5b5-97ff-4bb8-9149-1af3d8aa824f/volumes" Nov 26 15:37:25 crc kubenswrapper[4785]: I1126 15:37:25.054817 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5cd9a63-e7cf-4c00-b352-be258e30c83b" path="/var/lib/kubelet/pods/b5cd9a63-e7cf-4c00-b352-be258e30c83b/volumes" Nov 26 15:37:29 crc kubenswrapper[4785]: I1126 15:37:29.040554 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:37:29 crc kubenswrapper[4785]: E1126 15:37:29.041097 4785 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:37:32 crc kubenswrapper[4785]: I1126 15:37:32.031534 4785 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5bhl4"] Nov 26 15:37:32 crc kubenswrapper[4785]: I1126 15:37:32.041675 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5bhl4"] Nov 26 15:37:33 crc kubenswrapper[4785]: I1126 15:37:33.047950 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81fe919-3471-4913-a892-b03f703d3ed9" path="/var/lib/kubelet/pods/e81fe919-3471-4913-a892-b03f703d3ed9/volumes" Nov 26 15:37:44 crc kubenswrapper[4785]: I1126 15:37:44.036672 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:37:44 crc kubenswrapper[4785]: E1126 15:37:44.037636 4785 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkxdl_openshift-machine-config-operator(5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" Nov 26 15:37:59 crc kubenswrapper[4785]: I1126 15:37:59.036698 4785 scope.go:117] "RemoveContainer" containerID="3e8c4964e31211d666dbae0e50d629dfc05802d610982228b7bbd6c4dcfcee2a" Nov 26 15:37:59 crc kubenswrapper[4785]: I1126 15:37:59.409247 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" event={"ID":"5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4","Type":"ContainerStarted","Data":"650c6c67d3e5fb0fe41946a1c55b7fce91708c5465ab96aa7392f2d192557a46"} Nov 26 15:38:09 crc kubenswrapper[4785]: I1126 15:38:09.512102 4785 generic.go:334] "Generic (PLEG): container finished" podID="c7809fca-9387-4705-a57d-eaf21ba07762" containerID="d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc" exitCode=0 Nov 26 15:38:09 crc kubenswrapper[4785]: I1126 15:38:09.512729 4785 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" event={"ID":"c7809fca-9387-4705-a57d-eaf21ba07762","Type":"ContainerDied","Data":"d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc"} Nov 26 15:38:09 crc kubenswrapper[4785]: I1126 15:38:09.513198 4785 scope.go:117] "RemoveContainer" containerID="d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc" Nov 26 15:38:09 crc kubenswrapper[4785]: I1126 15:38:09.951249 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s5v8p_must-gather-hrpl6_c7809fca-9387-4705-a57d-eaf21ba07762/gather/0.log" Nov 26 15:38:16 crc kubenswrapper[4785]: I1126 15:38:16.735391 4785 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-must-gather-s5v8p/must-gather-hrpl6"] Nov 26 15:38:16 crc kubenswrapper[4785]: I1126 15:38:16.736212 4785 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" podUID="c7809fca-9387-4705-a57d-eaf21ba07762" containerName="copy" containerID="cri-o://50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff" gracePeriod=2 Nov 26 15:38:16 crc kubenswrapper[4785]: I1126 15:38:16.740852 4785 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s5v8p/must-gather-hrpl6"] Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.130650 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s5v8p_must-gather-hrpl6_c7809fca-9387-4705-a57d-eaf21ba07762/copy/0.log" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.131772 4785 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.246015 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvqzl\" (UniqueName: \"kubernetes.io/projected/c7809fca-9387-4705-a57d-eaf21ba07762-kube-api-access-wvqzl\") pod \"c7809fca-9387-4705-a57d-eaf21ba07762\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.246199 4785 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7809fca-9387-4705-a57d-eaf21ba07762-must-gather-output\") pod \"c7809fca-9387-4705-a57d-eaf21ba07762\" (UID: \"c7809fca-9387-4705-a57d-eaf21ba07762\") " Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.254067 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7809fca-9387-4705-a57d-eaf21ba07762-kube-api-access-wvqzl" 
(OuterVolumeSpecName: "kube-api-access-wvqzl") pod "c7809fca-9387-4705-a57d-eaf21ba07762" (UID: "c7809fca-9387-4705-a57d-eaf21ba07762"). InnerVolumeSpecName "kube-api-access-wvqzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.328602 4785 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7809fca-9387-4705-a57d-eaf21ba07762-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c7809fca-9387-4705-a57d-eaf21ba07762" (UID: "c7809fca-9387-4705-a57d-eaf21ba07762"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.348003 4785 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvqzl\" (UniqueName: \"kubernetes.io/projected/c7809fca-9387-4705-a57d-eaf21ba07762-kube-api-access-wvqzl\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.348036 4785 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7809fca-9387-4705-a57d-eaf21ba07762-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.585857 4785 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s5v8p_must-gather-hrpl6_c7809fca-9387-4705-a57d-eaf21ba07762/copy/0.log" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.586431 4785 generic.go:334] "Generic (PLEG): container finished" podID="c7809fca-9387-4705-a57d-eaf21ba07762" containerID="50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff" exitCode=143 Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.586479 4785 scope.go:117] "RemoveContainer" containerID="50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.586626 4785 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s5v8p/must-gather-hrpl6" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.618151 4785 scope.go:117] "RemoveContainer" containerID="d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.670373 4785 scope.go:117] "RemoveContainer" containerID="50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff" Nov 26 15:38:17 crc kubenswrapper[4785]: E1126 15:38:17.670836 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff\": container with ID starting with 50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff not found: ID does not exist" containerID="50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.670874 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff"} err="failed to get container status \"50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff\": rpc error: code = NotFound desc = could not find container \"50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff\": container with ID starting with 50ecfc14415efadbf461a9f6865dd747b1dd3a13c30f20afb0333e3e1b33a1ff not found: ID does not exist" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.670900 4785 scope.go:117] "RemoveContainer" containerID="d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc" Nov 26 15:38:17 crc kubenswrapper[4785]: E1126 15:38:17.671240 4785 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc\": container with ID starting with 
d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc not found: ID does not exist" containerID="d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc" Nov 26 15:38:17 crc kubenswrapper[4785]: I1126 15:38:17.671270 4785 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc"} err="failed to get container status \"d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc\": rpc error: code = NotFound desc = could not find container \"d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc\": container with ID starting with d38e7c646b10d374027944a3aedcefa7a7c587fe4360e18ea4ff8c2b2a174efc not found: ID does not exist" Nov 26 15:38:18 crc kubenswrapper[4785]: I1126 15:38:18.868230 4785 scope.go:117] "RemoveContainer" containerID="50f09461476963cbe51c87d941006a786c0461375c73c06f76c6b5dc5956c448" Nov 26 15:38:18 crc kubenswrapper[4785]: I1126 15:38:18.901339 4785 scope.go:117] "RemoveContainer" containerID="4b0ef06591a646674dc835100c6d1a286b80e10cc357cffd7c1e1b3636a5c35b" Nov 26 15:38:18 crc kubenswrapper[4785]: I1126 15:38:18.964377 4785 scope.go:117] "RemoveContainer" containerID="081d20860d7400cd61f7bde70e0d3481e66c32caf9cbbe4dabee7fa14820e4bb" Nov 26 15:38:19 crc kubenswrapper[4785]: I1126 15:38:19.044448 4785 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7809fca-9387-4705-a57d-eaf21ba07762" path="/var/lib/kubelet/pods/c7809fca-9387-4705-a57d-eaf21ba07762/volumes" Nov 26 15:38:46 crc kubenswrapper[4785]: E1126 15:38:46.736822 4785 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-hostnamed.service\": RecentStats: unable to find data in memory cache]" Nov 26 15:39:19 crc kubenswrapper[4785]: I1126 15:39:19.046734 4785 scope.go:117] "RemoveContainer" 
containerID="90c5acb30ad680d4d4e83c48bedd33d933eabe341995fc406429a70569aeead2" Nov 26 15:39:19 crc kubenswrapper[4785]: I1126 15:39:19.072547 4785 scope.go:117] "RemoveContainer" containerID="e3f5c65ba95b26d1475bb9e2dd916bd6c2a42d6816987e2943c8b1bf84c9985a" Nov 26 15:39:19 crc kubenswrapper[4785]: I1126 15:39:19.094617 4785 scope.go:117] "RemoveContainer" containerID="85198a4a0b020c0771901c8d52ecdc9a47a1b24b484dbd636cadcf60cb843423" Nov 26 15:39:19 crc kubenswrapper[4785]: I1126 15:39:19.131417 4785 scope.go:117] "RemoveContainer" containerID="db0b9c6c5995e44ceafc79a1d2c6dc67ba5ec7dd91f1a2bde6b657c37abe34fc" Nov 26 15:39:19 crc kubenswrapper[4785]: I1126 15:39:19.172574 4785 scope.go:117] "RemoveContainer" containerID="25e29c4311627efe08667078492a435d58d0181767c202437d86567030f68d9f" Nov 26 15:39:19 crc kubenswrapper[4785]: I1126 15:39:19.192990 4785 scope.go:117] "RemoveContainer" containerID="6f40a2db002e05e9f10bc8ff950b20eac68d64a61c9fba802fb86e21b04752ba" Nov 26 15:40:07 crc kubenswrapper[4785]: I1126 15:40:07.289699 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:40:07 crc kubenswrapper[4785]: I1126 15:40:07.290438 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 26 15:40:19 crc kubenswrapper[4785]: I1126 15:40:19.256524 4785 scope.go:117] "RemoveContainer" containerID="5e4a529b7319bcf2293f91ed3726ea197be6826d2eacab8b5611b309a0522ab5" Nov 26 15:40:19 crc kubenswrapper[4785]: I1126 15:40:19.280812 4785 scope.go:117] "RemoveContainer" 
containerID="5f5693f19addbc2d9415e35aff03b3d7eb99db31d8eb9a265fd90aa26357c093" Nov 26 15:40:19 crc kubenswrapper[4785]: I1126 15:40:19.310579 4785 scope.go:117] "RemoveContainer" containerID="cf7bd40cb3cb3a3bf0cffa98e568dc4b7a5659a4fd41b0030090d7c17e876acf" Nov 26 15:40:37 crc kubenswrapper[4785]: I1126 15:40:37.289292 4785 patch_prober.go:28] interesting pod/machine-config-daemon-gkxdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 26 15:40:37 crc kubenswrapper[4785]: I1126 15:40:37.289947 4785 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkxdl" podUID="5539d39a-e2bc-4e7f-8b5a-a5e3e10c4ba4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"